Column schema of this dump (name, type, and observed min/max length; for list columns the length is the number of items):

| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
ce17bce26d45559cda4d37267dc4475365901d78
# Dataset Card for Evaluation run of TheBloke/guanaco-13B-HF ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TheBloke/guanaco-13B-HF - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [TheBloke/guanaco-13B-HF](https://huggingface.co/TheBloke/guanaco-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TheBloke__guanaco-13B-HF", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T02:23:34.396726](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__guanaco-13B-HF/blob/main/results_2023-10-23T02-23-34.396726.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.003984899328859061, "em_stderr": 0.0006451805848102414, "f1": 0.06359479865771825, "f1_stderr": 0.001462243147092022, "acc": 0.422835936195714, "acc_stderr": 0.009899837599397724 }, "harness|drop|3": { "em": 0.003984899328859061, "em_stderr": 0.0006451805848102414, "f1": 0.06359479865771825, "f1_stderr": 0.001462243147092022 }, "harness|gsm8k|5": { "acc": 0.08718726307808947, "acc_stderr": 0.007770691416783571 }, "harness|winogrande|5": { "acc": 0.7584846093133386, "acc_stderr": 0.012028983782011875 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
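The snippet in the card loads the detail split of a single task. The aggregated scores live in the "results" configuration the card mentions; a minimal sketch of pulling its latest run (the config name "results" and split name "latest" are taken from this repository's metadata below):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split points at the
# most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__guanaco-13B-HF",
    "results",
    split="latest",
)
print(results[0])  # inspect the first row of aggregated results
```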
open-llm-leaderboard/details_TheBloke__guanaco-13B-HF
[ "region:us" ]
2023-08-18T10:28:11+00:00
{"pretty_name": "Evaluation run of TheBloke/guanaco-13B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/guanaco-13B-HF](https://huggingface.co/TheBloke/guanaco-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__guanaco-13B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:23:34.396726](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__guanaco-13B-HF/blob/main/results_2023-10-23T02-23-34.396726.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003984899328859061,\n \"em_stderr\": 0.0006451805848102414,\n \"f1\": 0.06359479865771825,\n \"f1_stderr\": 0.001462243147092022,\n \"acc\": 0.422835936195714,\n \"acc_stderr\": 0.009899837599397724\n },\n \"harness|drop|3\": {\n \"em\": 0.003984899328859061,\n \"em_stderr\": 0.0006451805848102414,\n \"f1\": 0.06359479865771825,\n \"f1_stderr\": 0.001462243147092022\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08718726307808947,\n \"acc_stderr\": 0.007770691416783571\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011875\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/guanaco-13B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T02_23_34.396726", "path": ["**/details_harness|drop|3_2023-10-23T02-23-34.396726.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-23-34.396726.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T02_23_34.396726", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-23-34.396726.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-23-34.396726.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:24:37.744515.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:24:37.744515.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:24:37.744515.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:24:37.744515.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:24:37.744515.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T02_23_34.396726", "path": ["**/details_harness|winogrande|5_2023-10-23T02-23-34.396726.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-23-34.396726.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_24_37.744515", "path": ["results_2023-07-19T19:24:37.744515.parquet"]}, {"split": "2023_10_23T02_23_34.396726", "path": ["results_2023-10-23T02-23-34.396726.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-23-34.396726.parquet"]}]}]}
2023-10-23T01:23:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/guanaco-13B-HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/guanaco-13B-HF on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T02:23:34.396726(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of TheBloke/guanaco-13B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T02:23:34.396726(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/guanaco-13B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T02:23:34.396726(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/guanaco-13B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:23:34.396726(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cd5daa75e42dddbff78b0d5aa6ffe2c42981e2b8
# Dataset Card for Evaluation run of TheBloke/robin-33B-v2-fp16 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TheBloke/robin-33B-v2-fp16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [TheBloke/robin-33B-v2-fp16](https://huggingface.co/TheBloke/robin-33B-v2-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TheBloke__robin-33B-v2-fp16", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-31T16:41:32.452325](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-33B-v2-fp16/blob/main/results_2023-07-31T16%3A41%3A32.452325.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5493694357469432, "acc_stderr": 0.03462857618448208, "acc_norm": 0.5533043005336739, "acc_norm_stderr": 0.03460642548466365, "mc1": 0.3574051407588739, "mc1_stderr": 0.016776599676729398, "mc2": 0.5388029530988832, "mc2_stderr": 0.014742138833066059 }, "harness|arc:challenge|25": { "acc": 0.5947098976109215, "acc_stderr": 0.014346869060229321, "acc_norm": 0.6237201365187713, "acc_norm_stderr": 0.014157022555407156 }, "harness|hellaswag|10": { "acc": 0.6331408086038638, "acc_stderr": 0.004809626723626824, "acc_norm": 0.8362875921131249, "acc_norm_stderr": 0.0036925819391622834 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.04033565667848319, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.04033565667848319 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5207547169811321, "acc_stderr": 0.030746349975723463, "acc_norm": 0.5207547169811321, "acc_norm_stderr": 0.030746349975723463 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.041553199555931467, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.041553199555931467 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 
0.04824181513244218 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.48554913294797686, "acc_stderr": 0.03810871630454764, "acc_norm": 0.48554913294797686, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3137254901960784, "acc_stderr": 0.04617034827006718, "acc_norm": 0.3137254901960784, "acc_norm_stderr": 0.04617034827006718 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.451063829787234, "acc_stderr": 0.032529096196131965, "acc_norm": 0.451063829787234, "acc_norm_stderr": 0.032529096196131965 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3684210526315789, "acc_stderr": 0.04537815354939392, "acc_norm": 0.3684210526315789, "acc_norm_stderr": 0.04537815354939392 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.0416180850350153, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.328042328042328, "acc_stderr": 0.024180497164376896, "acc_norm": 0.328042328042328, "acc_norm_stderr": 0.024180497164376896 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.31746031746031744, "acc_stderr": 0.04163453031302859, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.04163453031302859 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6161290322580645, "acc_stderr": 0.027666182075539638, "acc_norm": 0.6161290322580645, "acc_norm_stderr": 0.027666182075539638 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.37438423645320196, "acc_stderr": 0.03405155380561953, "acc_norm": 0.37438423645320196, "acc_norm_stderr": 0.03405155380561953 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885416, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885416 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03191178226713547, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03191178226713547 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.772020725388601, "acc_stderr": 0.030276909945178267, "acc_norm": 0.772020725388601, "acc_norm_stderr": 0.030276909945178267 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5025641025641026, "acc_stderr": 0.025350672979412202, "acc_norm": 0.5025641025641026, "acc_norm_stderr": 0.025350672979412202 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5630252100840336, "acc_stderr": 
0.03221943636566196, "acc_norm": 0.5630252100840336, "acc_norm_stderr": 0.03221943636566196 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7339449541284404, "acc_stderr": 0.018946022322225607, "acc_norm": 0.7339449541284404, "acc_norm_stderr": 0.018946022322225607 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4212962962962963, "acc_stderr": 0.03367462138896078, "acc_norm": 0.4212962962962963, "acc_norm_stderr": 0.03367462138896078 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591361, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7763713080168776, "acc_stderr": 0.027123298205229966, "acc_norm": 0.7763713080168776, "acc_norm_stderr": 0.027123298205229966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6188340807174888, "acc_stderr": 0.03259625118416827, "acc_norm": 0.6188340807174888, "acc_norm_stderr": 0.03259625118416827 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6106870229007634, "acc_stderr": 0.04276486542814591, "acc_norm": 0.6106870229007634, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7024793388429752, "acc_stderr": 0.04173349148083499, "acc_norm": 0.7024793388429752, "acc_norm_stderr": 0.04173349148083499 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.036230899157241474, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.036230899157241474 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3392857142857143, "acc_stderr": 0.04493949068613539, "acc_norm": 0.3392857142857143, "acc_norm_stderr": 0.04493949068613539 }, "harness|hendrycksTest-management|5": { "acc": 0.6310679611650486, "acc_stderr": 0.0477761518115674, "acc_norm": 0.6310679611650486, "acc_norm_stderr": 0.0477761518115674 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560396, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560396 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7305236270753512, "acc_stderr": 0.01586624307321506, "acc_norm": 0.7305236270753512, "acc_norm_stderr": 0.01586624307321506 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5953757225433526, "acc_stderr": 0.02642481659400985, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.02642481659400985 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.26033519553072626, "acc_stderr": 0.014676252009319476, "acc_norm": 0.26033519553072626, "acc_norm_stderr": 0.014676252009319476 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5882352941176471, "acc_stderr": 0.02818059632825929, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.02818059632825929 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6302250803858521, "acc_stderr": 0.027417996705630998, "acc_norm": 0.6302250803858521, "acc_norm_stderr": 0.027417996705630998 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.6080246913580247, "acc_stderr": 0.027163686038271146, "acc_norm": 0.6080246913580247, "acc_norm_stderr": 0.027163686038271146 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.02958345203628407, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.02958345203628407 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.41264667535853977, "acc_stderr": 0.012573836633799015, "acc_norm": 0.41264667535853977, "acc_norm_stderr": 0.012573836633799015 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5404411764705882, "acc_stderr": 0.03027332507734575, "acc_norm": 0.5404411764705882, "acc_norm_stderr": 0.03027332507734575 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5571895424836601, "acc_stderr": 0.020095083154577347, "acc_norm": 0.5571895424836601, "acc_norm_stderr": 0.020095083154577347 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5714285714285714, "acc_stderr": 0.031680911612338825, "acc_norm": 0.5714285714285714, "acc_norm_stderr": 0.031680911612338825 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7661691542288557, "acc_stderr": 0.02992941540834839, "acc_norm": 0.7661691542288557, "acc_norm_stderr": 0.02992941540834839 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.03889951252827217, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.03889951252827217 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.032744852119469564, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.032744852119469564 }, "harness|truthfulqa:mc|0": { "mc1": 0.3574051407588739, "mc1_stderr": 0.016776599676729398, "mc2": 0.5388029530988832, "mc2_stderr": 0.014742138833066059 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_TheBloke__robin-33B-v2-fp16
[ "region:us" ]
2023-08-18T10:28:19+00:00
{"pretty_name": "Evaluation run of TheBloke/robin-33B-v2-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/robin-33B-v2-fp16](https://huggingface.co/TheBloke/robin-33B-v2-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__robin-33B-v2-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-31T16:41:32.452325](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-33B-v2-fp16/blob/main/results_2023-07-31T16%3A41%3A32.452325.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5493694357469432,\n \"acc_stderr\": 0.03462857618448208,\n \"acc_norm\": 0.5533043005336739,\n \"acc_norm_stderr\": 0.03460642548466365,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.016776599676729398,\n \"mc2\": 0.5388029530988832,\n \"mc2_stderr\": 0.014742138833066059\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.014346869060229321,\n \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407156\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6331408086038638,\n \"acc_stderr\": 0.004809626723626824,\n \"acc_norm\": 0.8362875921131249,\n \"acc_norm_stderr\": 0.0036925819391622834\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n 
\"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n \"acc_stderr\": 0.027666182075539638,\n \"acc_norm\": 0.6161290322580645,\n \"acc_norm_stderr\": 0.027666182075539638\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561953,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561953\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178267,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178267\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5025641025641026,\n \"acc_stderr\": 0.025350672979412202,\n \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412202\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7339449541284404,\n \"acc_stderr\": 0.018946022322225607,\n \"acc_norm\": 0.7339449541284404,\n \"acc_norm_stderr\": 0.018946022322225607\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7305236270753512,\n \"acc_stderr\": 0.01586624307321506,\n 
\"acc_norm\": 0.7305236270753512,\n \"acc_norm_stderr\": 0.01586624307321506\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.02642481659400985,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.02642481659400985\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26033519553072626,\n \"acc_stderr\": 0.014676252009319476,\n \"acc_norm\": 0.26033519553072626,\n \"acc_norm_stderr\": 0.014676252009319476\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825929,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825929\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n \"acc_stderr\": 0.012573836633799015,\n \"acc_norm\": 0.41264667535853977,\n \"acc_norm_stderr\": 0.012573836633799015\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734575,\n \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734575\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5571895424836601,\n \"acc_stderr\": 0.020095083154577347,\n \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.020095083154577347\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.7661691542288557,\n \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.016776599676729398,\n \"mc2\": 0.5388029530988832,\n \"mc2_stderr\": 0.014742138833066059\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/robin-33B-v2-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_07_31T16_41_32.452325", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hellaswag|10_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:41:32.452325.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:41:32.452325.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:41:32.452325.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T16:41:32.452325.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:41:32.452325.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T16_41_32.452325", "path": ["results_2023-07-31T16:41:32.452325.parquet"]}, {"split": "latest", "path": ["results_2023-07-31T16:41:32.452325.parquet"]}]}]}
2023-08-27T11:34:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/robin-33B-v2-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/robin-33B-v2-fp16 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the Python sketch after this text block): ## Latest results These are the latest results from run 2023-07-31T16:41:32.452325 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
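The loader call referenced in the Dataset Summary above was stripped from this plain-text rendering of the card. A minimal sketch, mirroring the snippet given verbatim in this record's metadata (dataset `open-llm-leaderboard/details_TheBloke__robin-33B-v2-fp16`, config `harness_truthfulqa_mc_0`, split `train`), would be:

```python
from datasets import load_dataset

# Per-example details for one evaluated task of this run; per the card,
# the "train" split always points at the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__robin-33B-v2-fp16",
    "harness_truthfulqa_mc_0",
    split="train",
)
```

Any other configuration name listed in the metadata (for instance one of the per-subject `harness_hendrycksTest_*_5` configs) can be substituted as the second argument.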
[ "# Dataset Card for Evaluation run of TheBloke/robin-33B-v2-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/robin-33B-v2-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-31T16:41:32.452325 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/robin-33B-v2-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/robin-33B-v2-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-31T16:41:32.452325 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/robin-33B-v2-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/robin-33B-v2-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-31T16:41:32.452325 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f4d08599c461c55a950181aebb48747e4fb2c877
# Dataset Card for Evaluation run of TheBloke/koala-13B-HF ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TheBloke/koala-13B-HF - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [TheBloke/koala-13B-HF](https://huggingface.co/TheBloke/koala-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TheBloke__koala-13B-HF", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-22T08:43:38.346498](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-13B-HF/blob/main/results_2023-10-22T08-43-38.346498.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.021707214765100673, "em_stderr": 0.0014923686874006184, "f1": 0.09106753355704705, "f1_stderr": 0.0020580604985252385, "acc": 0.40428250097386687, "acc_stderr": 0.009634029824810052 }, "harness|drop|3": { "em": 0.021707214765100673, "em_stderr": 0.0014923686874006184, "f1": 0.09106753355704705, "f1_stderr": 0.0020580604985252385 }, "harness|gsm8k|5": { "acc": 0.06823351023502654, "acc_stderr": 0.006945358944067431 }, "harness|winogrande|5": { "acc": 0.7403314917127072, "acc_stderr": 0.012322700705552673 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_TheBloke__koala-13B-HF
[ "region:us" ]
2023-08-18T10:28:27+00:00
{"pretty_name": "Evaluation run of TheBloke/koala-13B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/koala-13B-HF](https://huggingface.co/TheBloke/koala-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__koala-13B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T08:43:38.346498](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-13B-HF/blob/main/results_2023-10-22T08-43-38.346498.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.021707214765100673,\n \"em_stderr\": 0.0014923686874006184,\n \"f1\": 0.09106753355704705,\n \"f1_stderr\": 0.0020580604985252385,\n \"acc\": 0.40428250097386687,\n \"acc_stderr\": 0.009634029824810052\n },\n \"harness|drop|3\": {\n \"em\": 0.021707214765100673,\n \"em_stderr\": 0.0014923686874006184,\n \"f1\": 0.09106753355704705,\n \"f1_stderr\": 0.0020580604985252385\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552673\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/koala-13B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T08_43_38.346498", "path": ["**/details_harness|drop|3_2023-10-22T08-43-38.346498.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T08-43-38.346498.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T08_43_38.346498", "path": ["**/details_harness|gsm8k|5_2023-10-22T08-43-38.346498.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T08-43-38.346498.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:49:04.838102.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:49:04.838102.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T08_43_38.346498", "path": ["**/details_harness|winogrande|5_2023-10-22T08-43-38.346498.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T08-43-38.346498.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_49_04.838102", "path": ["results_2023-07-19T18:49:04.838102.parquet"]}, {"split": "2023_10_22T08_43_38.346498", "path": ["results_2023-10-22T08-43-38.346498.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T08-43-38.346498.parquet"]}]}]}
2023-10-22T07:43:50+00:00
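The configuration listing in the metadata above can also be enumerated programmatically rather than read out by hand. A minimal sketch, assuming the `datasets` library is installed and the repository is public (the helpers below are standard `datasets` utilities, not part of this dataset itself):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_TheBloke__koala-13B-HF"

# One config per evaluated task, plus the aggregated "results" config.
for config in get_dataset_config_names(repo):
    # Each config exposes timestamped splits plus a "latest" alias.
    print(config, get_dataset_split_names(repo, config))
```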
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/koala-13B-HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/koala-13B-HF on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T08:43:38.346498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
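The load call referred to by "do the following:" is not reproduced in this stripped rendering; a minimal sketch of such a call, assuming the `datasets` library is installed and using the repository id and the `harness_winogrande_5` config documented in the metadata above:

```python
from datasets import load_dataset

# Per-example details for one task configuration of the koala-13B-HF evaluation run;
# the "train" split always points at the latest results.
data = load_dataset("open-llm-leaderboard/details_TheBloke__koala-13B-HF",
	"harness_winogrande_5",
	split="train")
```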
[ "# Dataset Card for Evaluation run of TheBloke/koala-13B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/koala-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T08:43:38.346498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/koala-13B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/koala-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T08:43:38.346498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/koala-13B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/koala-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T08:43:38.346498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7872dc01f555bab1946240741af10bc9c46ab29e
# Dataset Card for Evaluation run of TheBloke/robin-13B-v2-fp16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TheBloke/robin-13B-v2-fp16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/robin-13B-v2-fp16](https://huggingface.co/TheBloke/robin-13B-v2-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__robin-13B-v2-fp16",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-07-31T15:48:06.598529](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-13B-v2-fp16/blob/main/results_2023-07-31T15%3A48%3A06.598529.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.49056004249413854,
        "acc_stderr": 0.034895228964178376,
        "acc_norm": 0.49452555601900244,
        "acc_norm_stderr": 0.03487806793899599,
        "mc1": 0.34149326805385555,
        "mc1_stderr": 0.016600688619950826,
        "mc2": 0.5063100731922137,
        "mc2_stderr": 0.014760623429029368
    },
    "harness|arc:challenge|25": {
        "acc": 0.5401023890784983,
        "acc_stderr": 0.01456431885692485,
        "acc_norm": 0.5648464163822525,
        "acc_norm_stderr": 0.014487986197186045
    },
    "harness|hellaswag|10": {
        "acc": 0.5945030870344553,
        "acc_stderr": 0.004899845087183104,
        "acc_norm": 0.8037243576976698,
        "acc_norm_stderr": 0.003963677261161229
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.33,
        "acc_stderr": 0.04725815626252606,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.04725815626252606
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.4666666666666667,
        "acc_stderr": 0.043097329010363554,
        "acc_norm": 0.4666666666666667,
        "acc_norm_stderr": 0.043097329010363554
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.4868421052631579,
        "acc_stderr": 0.04067533136309173,
        "acc_norm": 0.4868421052631579,
        "acc_norm_stderr": 0.04067533136309173
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.45,
        "acc_stderr": 0.05,
        "acc_norm": 0.45,
        "acc_norm_stderr": 0.05
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.4679245283018868,
        "acc_stderr": 0.03070948699255655,
        "acc_norm": 0.4679245283018868,
        "acc_norm_stderr": 0.03070948699255655
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.4722222222222222,
        "acc_stderr": 0.04174752578923185,
        "acc_norm": 0.4722222222222222,
        "acc_norm_stderr": 0.04174752578923185
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.24,
        "acc_stderr": 0.04292346959909284,
        "acc_norm": 0.24,
        "acc_norm_stderr": 0.04292346959909284
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145632,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145632
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.31,
        "acc_stderr": 0.04648231987117317,
        "acc_norm": 0.31,
        "acc_norm_stderr": 0.04648231987117317
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.44508670520231214,
        "acc_stderr": 0.03789401760283646,
        "acc_norm": 0.44508670520231214,
        "acc_norm_stderr": 0.03789401760283646
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.17647058823529413,
        "acc_stderr": 0.0379328118530781,
        "acc_norm": 0.17647058823529413,
        "acc_norm_stderr": 0.0379328118530781
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.62,
        "acc_stderr": 0.048783173121456316,
        "acc_norm": 0.62,
        "acc_norm_stderr": 0.048783173121456316
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.4,
        "acc_stderr": 0.03202563076101735,
        "acc_norm": 0.4,
        "acc_norm_stderr": 0.03202563076101735
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.30701754385964913,
        "acc_stderr": 0.04339138322579861,
        "acc_norm": 0.30701754385964913,
        "acc_norm_stderr": 0.04339138322579861
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.4068965517241379,
        "acc_stderr": 0.04093793981266237,
        "acc_norm": 0.4068965517241379,
        "acc_norm_stderr": 0.04093793981266237
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.25925925925925924,
        "acc_stderr": 0.02256989707491841,
        "acc_norm": 0.25925925925925924,
        "acc_norm_stderr": 0.02256989707491841
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.31746031746031744,
        "acc_stderr": 0.04163453031302859,
        "acc_norm": 0.31746031746031744,
        "acc_norm_stderr": 0.04163453031302859
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145632,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145632
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.49032258064516127,
        "acc_stderr": 0.028438677998909558,
        "acc_norm": 0.49032258064516127,
        "acc_norm_stderr": 0.028438677998909558
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.32019704433497537,
        "acc_stderr": 0.032826493853041504,
        "acc_norm": 0.32019704433497537,
        "acc_norm_stderr": 0.032826493853041504
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.47,
        "acc_stderr": 0.05016135580465919,
        "acc_norm": 0.47,
        "acc_norm_stderr": 0.05016135580465919
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.6303030303030303,
        "acc_stderr": 0.037694303145125674,
        "acc_norm": 0.6303030303030303,
        "acc_norm_stderr": 0.037694303145125674
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.5606060606060606,
        "acc_stderr": 0.03536085947529479,
        "acc_norm": 0.5606060606060606,
        "acc_norm_stderr": 0.03536085947529479
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.6683937823834197,
        "acc_stderr": 0.03397636541089118,
        "acc_norm": 0.6683937823834197,
        "acc_norm_stderr": 0.03397636541089118
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.44871794871794873,
        "acc_stderr": 0.025217315184846482,
        "acc_norm": 0.44871794871794873,
        "acc_norm_stderr": 0.025217315184846482
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.23333333333333334,
        "acc_stderr": 0.02578787422095932,
        "acc_norm": 0.23333333333333334,
        "acc_norm_stderr": 0.02578787422095932
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.4411764705882353,
        "acc_stderr": 0.0322529423239964,
        "acc_norm": 0.4411764705882353,
        "acc_norm_stderr": 0.0322529423239964
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.2781456953642384,
        "acc_stderr": 0.03658603262763743,
        "acc_norm": 0.2781456953642384,
        "acc_norm_stderr": 0.03658603262763743
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.6605504587155964,
        "acc_stderr": 0.02030210934266235,
        "acc_norm": 0.6605504587155964,
        "acc_norm_stderr": 0.02030210934266235
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.30092592592592593,
        "acc_stderr": 0.03128039084329882,
        "acc_norm": 0.30092592592592593,
        "acc_norm_stderr": 0.03128039084329882
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.6274509803921569,
        "acc_stderr": 0.03393388584958404,
        "acc_norm": 0.6274509803921569,
        "acc_norm_stderr": 0.03393388584958404
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7215189873417721,
        "acc_stderr": 0.029178682304842544,
        "acc_norm": 0.7215189873417721,
        "acc_norm_stderr": 0.029178682304842544
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.5695067264573991,
        "acc_stderr": 0.033231973029429394,
        "acc_norm": 0.5695067264573991,
        "acc_norm_stderr": 0.033231973029429394
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.6106870229007634,
        "acc_stderr": 0.04276486542814591,
        "acc_norm": 0.6106870229007634,
        "acc_norm_stderr": 0.04276486542814591
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.7024793388429752,
        "acc_stderr": 0.04173349148083499,
        "acc_norm": 0.7024793388429752,
        "acc_norm_stderr": 0.04173349148083499
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.5740740740740741,
        "acc_stderr": 0.0478034362693679,
        "acc_norm": 0.5740740740740741,
        "acc_norm_stderr": 0.0478034362693679
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.5828220858895705,
        "acc_stderr": 0.03874102859818081,
        "acc_norm": 0.5828220858895705,
        "acc_norm_stderr": 0.03874102859818081
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.5089285714285714,
        "acc_stderr": 0.04745033255489122,
        "acc_norm": 0.5089285714285714,
        "acc_norm_stderr": 0.04745033255489122
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.6407766990291263,
        "acc_stderr": 0.047504583990416946,
        "acc_norm": 0.6407766990291263,
        "acc_norm_stderr": 0.047504583990416946
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.7521367521367521,
        "acc_stderr": 0.0282863240755644,
        "acc_norm": 0.7521367521367521,
        "acc_norm_stderr": 0.0282863240755644
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.56,
        "acc_stderr": 0.04988876515698589,
        "acc_norm": 0.56,
        "acc_norm_stderr": 0.04988876515698589
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.6883780332056194,
        "acc_stderr": 0.016562433867284176,
        "acc_norm": 0.6883780332056194,
        "acc_norm_stderr": 0.016562433867284176
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.5,
        "acc_stderr": 0.026919095102908273,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.026919095102908273
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.25027932960893856,
        "acc_stderr": 0.01448750085285041,
        "acc_norm": 0.25027932960893856,
        "acc_norm_stderr": 0.01448750085285041
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.5065359477124183,
        "acc_stderr": 0.028627470550556047,
        "acc_norm": 0.5065359477124183,
        "acc_norm_stderr": 0.028627470550556047
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.5337620578778135,
        "acc_stderr": 0.028333277109562786,
        "acc_norm": 0.5337620578778135,
        "acc_norm_stderr": 0.028333277109562786
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.5524691358024691,
        "acc_stderr": 0.02766713856942271,
        "acc_norm": 0.5524691358024691,
        "acc_norm_stderr": 0.02766713856942271
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.37943262411347517,
        "acc_stderr": 0.028947338851614105,
        "acc_norm": 0.37943262411347517,
        "acc_norm_stderr": 0.028947338851614105
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.4211212516297262,
        "acc_stderr": 0.012610325733489903,
        "acc_norm": 0.4211212516297262,
        "acc_norm_stderr": 0.012610325733489903
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.5147058823529411,
        "acc_stderr": 0.03035969707904612,
        "acc_norm": 0.5147058823529411,
        "acc_norm_stderr": 0.03035969707904612
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.48366013071895425,
        "acc_stderr": 0.020217030653186453,
        "acc_norm": 0.48366013071895425,
        "acc_norm_stderr": 0.020217030653186453
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.5636363636363636,
        "acc_stderr": 0.04750185058907296,
        "acc_norm": 0.5636363636363636,
        "acc_norm_stderr": 0.04750185058907296
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.5551020408163265,
        "acc_stderr": 0.031814251181977865,
        "acc_norm": 0.5551020408163265,
        "acc_norm_stderr": 0.031814251181977865
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.6567164179104478,
        "acc_stderr": 0.03357379665433431,
        "acc_norm": 0.6567164179104478,
        "acc_norm_stderr": 0.03357379665433431
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.78,
        "acc_stderr": 0.04163331998932264,
        "acc_norm": 0.78,
        "acc_norm_stderr": 0.04163331998932264
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.4578313253012048,
        "acc_stderr": 0.038786267710023595,
        "acc_norm": 0.4578313253012048,
        "acc_norm_stderr": 0.038786267710023595
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.695906432748538,
        "acc_stderr": 0.0352821125824523,
        "acc_norm": 0.695906432748538,
        "acc_norm_stderr": 0.0352821125824523
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.34149326805385555,
        "mc1_stderr": 0.016600688619950826,
        "mc2": 0.5063100731922137,
        "mc2_stderr": 0.014760623429029368
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_TheBloke__robin-13B-v2-fp16
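Beyond the per-task details, the aggregated scores for the repository id above can be pulled from its "results" configuration in the same way. A minimal sketch, assuming the `datasets` and `pandas` packages are installed and that this repository exposes a "latest" split for "results" as the other configurations documented here do (the column layout depends on the harness run, so treat the inspection step as illustrative):

```python
from datasets import load_dataset

# Aggregated metrics for TheBloke/robin-13B-v2-fp16; "latest" points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__robin-13B-v2-fp16",
    "results",
    split="latest",
)

# Convert to pandas for quick inspection; the exact columns depend on the run.
df = results.to_pandas()
print(df.head())
```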
[ "region:us" ]
2023-08-18T10:28:36+00:00
{"pretty_name": "Evaluation run of TheBloke/robin-13B-v2-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/robin-13B-v2-fp16](https://huggingface.co/TheBloke/robin-13B-v2-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__robin-13B-v2-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-31T15:48:06.598529](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-13B-v2-fp16/blob/main/results_2023-07-31T15%3A48%3A06.598529.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49056004249413854,\n \"acc_stderr\": 0.034895228964178376,\n \"acc_norm\": 0.49452555601900244,\n \"acc_norm_stderr\": 0.03487806793899599,\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5063100731922137,\n \"mc2_stderr\": 0.014760623429029368\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.01456431885692485,\n \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186045\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5945030870344553,\n \"acc_stderr\": 0.004899845087183104,\n \"acc_norm\": 0.8037243576976698,\n \"acc_norm_stderr\": 0.003963677261161229\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.03070948699255655,\n \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.03070948699255655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 
0.04292346959909284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117317,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117317\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.03789401760283646,\n \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.03789401760283646\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266237,\n \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266237\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.037694303145125674,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.037694303145125674\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5606060606060606,\n \"acc_stderr\": 0.03536085947529479,\n \"acc_norm\": 0.5606060606060606,\n \"acc_norm_stderr\": 0.03536085947529479\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n 
\"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095932,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095932\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6605504587155964,\n \"acc_stderr\": 0.02030210934266235,\n \"acc_norm\": 0.6605504587155964,\n \"acc_norm_stderr\": 0.02030210934266235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329882,\n \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329882\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.03393388584958404,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.03393388584958404\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842544,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842544\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.03874102859818081,\n \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.03874102859818081\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.7521367521367521,\n \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6883780332056194,\n \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.6883780332056194,\n \"acc_norm_stderr\": 0.016562433867284176\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.01448750085285041,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.01448750085285041\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556047,\n \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556047\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5337620578778135,\n \"acc_stderr\": 0.028333277109562786,\n \"acc_norm\": 0.5337620578778135,\n \"acc_norm_stderr\": 0.028333277109562786\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.02766713856942271,\n \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.02766713856942271\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489903,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489903\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186453,\n \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186453\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.5063100731922137,\n \"mc2_stderr\": 0.014760623429029368\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/robin-13B-v2-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": 
["**/details_harness|arc:challenge|25_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hellaswag|10_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:48:06.598529.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:48:06.598529.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:48:06.598529.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:48:06.598529.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:48:06.598529.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T15_48_06.598529", "path": ["results_2023-07-31T15:48:06.598529.parquet"]}, {"split": "latest", "path": ["results_2023-07-31T15:48:06.598529.parquet"]}]}]}
2023-08-27T11:34:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/robin-13B-v2-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/robin-13B-v2-fp16 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-07-31T15:48:06.598529 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
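The loading snippet referenced above is stripped in this plain-text rendering; a minimal sketch of it, using the repo id and the `harness_truthfulqa_mc_0` config listed in this record's metadata:

```python
from datasets import load_dataset

# The "train" split of any config always points at the latest results for that task.
data = load_dataset("open-llm-leaderboard/details_TheBloke__robin-13B-v2-fp16",
	"harness_truthfulqa_mc_0",
	split="train")
```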
[ "# Dataset Card for Evaluation run of TheBloke/robin-13B-v2-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/robin-13B-v2-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-31T15:48:06.598529 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/robin-13B-v2-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/robin-13B-v2-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-31T15:48:06.598529 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/robin-13B-v2-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/robin-13B-v2-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-31T15:48:06.598529 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
86cc80ae3a76b21f6585a8c1a2b8e13bcd5b05bd
# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora_mlp-65B-HF

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TheBloke/gpt4-alpaca-lora_mlp-65B-HF
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/gpt4-alpaca-lora_mlp-65B-HF](https://huggingface.co/TheBloke/gpt4-alpaca-lora_mlp-65B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T07:45:08.272902](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF/blob/main/results_2023-10-23T07-45-08.272902.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.015625,
        "em_stderr": 0.0012700767094662763,
        "f1": 0.09636115771812082,
        "f1_stderr": 0.0019819425315034905,
        "acc": 0.5447099133363212,
        "acc_stderr": 0.011752408531897077
    },
    "harness|drop|3": {
        "em": 0.015625,
        "em_stderr": 0.0012700767094662763,
        "f1": 0.09636115771812082,
        "f1_stderr": 0.0019819425315034905
    },
    "harness|gsm8k|5": {
        "acc": 0.28278999241849884,
        "acc_stderr": 0.01240502041787362
    },
    "harness|winogrande|5": {
        "acc": 0.8066298342541437,
        "acc_stderr": 0.011099796645920533
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
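As a usage sketch (not part of the generated card): besides the "train" split shown above, every config in this dataset's metadata also exposes a "latest" split alias, and run-level aggregates live in the "results" config; the config names below are taken from that metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the whole run, read from the "results" config via its "latest" split.
results = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF",
	"results",
	split="latest")

# Per-task details, e.g. the 5-shot GSM8K run.
gsm8k = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF",
	"harness_gsm8k_5",
	split="latest")
```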
open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF
[ "region:us" ]
2023-08-18T10:28:44+00:00
{"pretty_name": "Evaluation run of TheBloke/gpt4-alpaca-lora_mlp-65B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/gpt4-alpaca-lora_mlp-65B-HF](https://huggingface.co/TheBloke/gpt4-alpaca-lora_mlp-65B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T07:45:08.272902](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF/blob/main/results_2023-10-23T07-45-08.272902.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.015625,\n \"em_stderr\": 0.0012700767094662763,\n \"f1\": 0.09636115771812082,\n \"f1_stderr\": 0.0019819425315034905,\n \"acc\": 0.5447099133363212,\n \"acc_stderr\": 0.011752408531897077\n },\n \"harness|drop|3\": {\n \"em\": 0.015625,\n \"em_stderr\": 0.0012700767094662763,\n \"f1\": 0.09636115771812082,\n \"f1_stderr\": 0.0019819425315034905\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28278999241849884,\n \"acc_stderr\": 0.01240502041787362\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920533\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/gpt4-alpaca-lora_mlp-65B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T07_45_08.272902", "path": ["**/details_harness|drop|3_2023-10-23T07-45-08.272902.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T07-45-08.272902.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T07_45_08.272902", "path": ["**/details_harness|gsm8k|5_2023-10-23T07-45-08.272902.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T07-45-08.272902.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:53:38.948593.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:53:38.948593.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:53:38.948593.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:53:38.948593.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:53:38.948593.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T07_45_08.272902", "path": ["**/details_harness|winogrande|5_2023-10-23T07-45-08.272902.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T07-45-08.272902.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T19_53_38.948593", "path": ["results_2023-07-25T19:53:38.948593.parquet"]}, {"split": "2023_10_23T07_45_08.272902", "path": ["results_2023-10-23T07-45-08.272902.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T07-45-08.272902.parquet"]}]}]}
2023-10-23T06:45:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora_mlp-65B-HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora_mlp-65B-HF on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T07:45:08.272902 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
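The condensed card above omits the loading snippet that the full cards in this dump include after "you can for instance do the following:". As a minimal sketch, assuming this record's details repository follows the same `open-llm-leaderboard/details_<org>__<model>` naming convention as the neighbouring cards and exposes the `harness_winogrande_5` configuration listed in its metadata, loading one run's details would look like this:

```python
from datasets import load_dataset

# Repository id inferred from the naming convention of the other evaluation-details
# datasets in this dump (an assumption; this record does not state its id here).
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora_mlp-65B-HF",
    "harness_winogrande_5",  # config name taken from the metadata above
    split="train",           # per the card, "train" always points to the latest results
)
```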
[ "# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora_mlp-65B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora_mlp-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T07:45:08.272902(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora_mlp-65B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora_mlp-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T07:45:08.272902(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora_mlp-65B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora_mlp-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T07:45:08.272902(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c0511838db265d5e393dd49774956f438bdc6a8e
# Dataset Card for Evaluation run of TheBloke/tulu-7B-fp16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TheBloke/tulu-7B-fp16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/tulu-7B-fp16](https://huggingface.co/TheBloke/tulu-7B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__tulu-7B-fp16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T23:41:54.207641](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__tulu-7B-fp16/blob/main/results_2023-10-22T23-41-54.207641.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2993917785234899,
        "em_stderr": 0.004690263056389047,
        "f1": 0.33736996644295303,
        "f1_stderr": 0.004651138439477223,
        "acc": 0.42508495529786566,
        "acc_stderr": 0.010526343784939971
    },
    "harness|drop|3": {
        "em": 0.2993917785234899,
        "em_stderr": 0.004690263056389047,
        "f1": 0.33736996644295303,
        "f1_stderr": 0.004651138439477223
    },
    "harness|gsm8k|5": {
        "acc": 0.11220621683093253,
        "acc_stderr": 0.008693743138242383
    },
    "harness|winogrande|5": {
        "acc": 0.7379636937647988,
        "acc_stderr": 0.012358944431637561
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
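The snippet in the card loads a single task configuration. As a complementary sketch, assuming the "results" configuration and "latest" split exist as the summary above describes, the aggregated metrics of the most recent run could be loaded the same way:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split points
# at the most recent one (names taken from the card's own description, assumed here).
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__tulu-7B-fp16",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the latest run
```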
open-llm-leaderboard/details_TheBloke__tulu-7B-fp16
[ "region:us" ]
2023-08-18T10:28:53+00:00
{"pretty_name": "Evaluation run of TheBloke/tulu-7B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/tulu-7B-fp16](https://huggingface.co/TheBloke/tulu-7B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__tulu-7B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T23:41:54.207641](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__tulu-7B-fp16/blob/main/results_2023-10-22T23-41-54.207641.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2993917785234899,\n \"em_stderr\": 0.004690263056389047,\n \"f1\": 0.33736996644295303,\n \"f1_stderr\": 0.004651138439477223,\n \"acc\": 0.42508495529786566,\n \"acc_stderr\": 0.010526343784939971\n },\n \"harness|drop|3\": {\n \"em\": 0.2993917785234899,\n \"em_stderr\": 0.004690263056389047,\n \"f1\": 0.33736996644295303,\n \"f1_stderr\": 0.004651138439477223\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11220621683093253,\n \"acc_stderr\": 0.008693743138242383\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637561\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/tulu-7B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T23_41_54.207641", "path": ["**/details_harness|drop|3_2023-10-22T23-41-54.207641.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T23-41-54.207641.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T23_41_54.207641", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-41-54.207641.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-41-54.207641.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:47.759549.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:47.759549.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:47.759549.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:17:47.759549.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:17:47.759549.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T23_41_54.207641", "path": ["**/details_harness|winogrande|5_2023-10-22T23-41-54.207641.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T23-41-54.207641.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_17_47.759549", "path": ["results_2023-07-19T17:17:47.759549.parquet"]}, {"split": "2023_10_22T23_41_54.207641", "path": ["results_2023-10-22T23-41-54.207641.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T23-41-54.207641.parquet"]}]}]}
2023-10-22T22:42:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/tulu-7B-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/tulu-7B-fp16 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T23:41:54.207641 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
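A minimal sketch of the load call referenced in the summary above. The repository id below is an assumption inferred from the leaderboard's naming pattern (it is not stated in this record), while the "harness_winogrande_5" configuration and its "latest" split are taken from the configs listed in the metadata:

```python
from datasets import load_dataset

# Assumed repository id, inferred from the leaderboard's naming pattern; adjust if it differs.
repo_id = "open-llm-leaderboard/details_TheBloke__tulu-7B-fp16"

# Per-example details for the 5-shot Winogrande run; "latest" always points
# at the most recent timestamped split for this configuration.
details = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(details.column_names)
```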
[ "# Dataset Card for Evaluation run of TheBloke/tulu-7B-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/tulu-7B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T23:41:54.207641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/tulu-7B-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/tulu-7B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T23:41:54.207641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/tulu-7B-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/tulu-7B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T23:41:54.207641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
584629af80e1d61bb079d4cfd65cbcad5d36aeab
# Dataset Card for Evaluation run of TheBloke/alpaca-lora-65B-HF

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TheBloke/alpaca-lora-65B-HF
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/alpaca-lora-65B-HF](https://huggingface.co/TheBloke/alpaca-lora-65B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__alpaca-lora-65B-HF",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T02:09:35.586177](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__alpaca-lora-65B-HF/blob/main/results_2023-10-23T02-09-35.586177.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.10255872483221476,
        "em_stderr": 0.0031069121780170463,
        "f1": 0.16075398489932788,
        "f1_stderr": 0.0032128112295639008,
        "acc": 0.546335119104964,
        "acc_stderr": 0.011676044797182322
    },
    "harness|drop|3": {
        "em": 0.10255872483221476,
        "em_stderr": 0.0031069121780170463,
        "f1": 0.16075398489932788,
        "f1_stderr": 0.0032128112295639008
    },
    "harness|gsm8k|5": {
        "acc": 0.2805155420773313,
        "acc_stderr": 0.012374608490929553
    },
    "harness|winogrande|5": {
        "acc": 0.8121546961325967,
        "acc_stderr": 0.010977481103435091
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
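For the aggregated scores rather than the per-task details, the "results" configuration described above can be loaded the same way. A minimal sketch, assuming only that the `datasets` library is installed; the config and split names are the ones listed in this card and its metadata:

```python
from datasets import load_dataset

# Aggregated metrics for this model's evaluation runs: the "results" config
# keeps one timestamped split per run, plus a "latest" split for the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__alpaca-lora-65B-HF",
    "results",
    split="latest",
)
print(results.column_names)
```

The timestamped splits (for example "2023_10_23T02_09_35.586177") preserve the history of successive runs, while "latest" always points at the most recent one.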
open-llm-leaderboard/details_TheBloke__alpaca-lora-65B-HF
[ "region:us" ]
2023-08-18T10:29:01+00:00
{"pretty_name": "Evaluation run of TheBloke/alpaca-lora-65B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/alpaca-lora-65B-HF](https://huggingface.co/TheBloke/alpaca-lora-65B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__alpaca-lora-65B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:09:35.586177](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__alpaca-lora-65B-HF/blob/main/results_2023-10-23T02-09-35.586177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10255872483221476,\n \"em_stderr\": 0.0031069121780170463,\n \"f1\": 0.16075398489932788,\n \"f1_stderr\": 0.0032128112295639008,\n \"acc\": 0.546335119104964,\n \"acc_stderr\": 0.011676044797182322\n },\n \"harness|drop|3\": {\n \"em\": 0.10255872483221476,\n \"em_stderr\": 0.0031069121780170463,\n \"f1\": 0.16075398489932788,\n \"f1_stderr\": 0.0032128112295639008\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2805155420773313,\n \"acc_stderr\": 0.012374608490929553\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/alpaca-lora-65B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T02_09_35.586177", "path": ["**/details_harness|drop|3_2023-10-23T02-09-35.586177.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-09-35.586177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T02_09_35.586177", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-09-35.586177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-09-35.586177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:46:53.347899.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:46:53.347899.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:46:53.347899.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:46:53.347899.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:46:53.347899.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T02_09_35.586177", "path": ["**/details_harness|winogrande|5_2023-10-23T02-09-35.586177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-09-35.586177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T19_46_53.347899", "path": ["results_2023-07-25T19:46:53.347899.parquet"]}, {"split": "2023_10_23T02_09_35.586177", "path": ["results_2023-10-23T02-09-35.586177.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-09-35.586177.parquet"]}]}]}
2023-10-23T01:09:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/alpaca-lora-65B-HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/alpaca-lora-65B-HF on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T02:09:35.586177 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of TheBloke/alpaca-lora-65B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/alpaca-lora-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T02:09:35.586177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/alpaca-lora-65B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/alpaca-lora-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T02:09:35.586177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/alpaca-lora-65B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/alpaca-lora-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:09:35.586177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
852f569546b23293c2fcdfdf6e302901e936d9c5
# Dataset Card for Evaluation run of TheBloke/llama-2-70b-Guanaco-QLoRA-fp16

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/llama-2-70b-Guanaco-QLoRA-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/llama-2-70b-Guanaco-QLoRA-fp16](https://huggingface.co/TheBloke/llama-2-70b-Guanaco-QLoRA-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__llama-2-70b-Guanaco-QLoRA-fp16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T03:53:16.698758](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__llama-2-70b-Guanaco-QLoRA-fp16/blob/main/results_2023-10-22T03-53-16.698758.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.05620805369127517,
        "em_stderr": 0.0023587236332230886,
        "f1": 0.11980180369127513,
        "f1_stderr": 0.002592264922824749,
        "acc": 0.5688660001089055,
        "acc_stderr": 0.011453570865550992
    },
    "harness|drop|3": {
        "em": 0.05620805369127517,
        "em_stderr": 0.0023587236332230886,
        "f1": 0.11980180369127513,
        "f1_stderr": 0.002592264922824749
    },
    "harness|gsm8k|5": {
        "acc": 0.2979529946929492,
        "acc_stderr": 0.012597932232914513
    },
    "harness|winogrande|5": {
        "acc": 0.8397790055248618,
        "acc_stderr": 0.010309209498187472
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
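Beyond pulling a single harness configuration as in the snippet above, the aggregated run metrics can be read back through the "results" configuration described in this card. A minimal sketch, assuming only the `datasets` library and the configuration and split names listed here:

```python
from datasets import load_dataset

# The "results" configuration aggregates every run; its "latest" split always
# points to the most recent evaluation of this model.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__llama-2-70b-Guanaco-QLoRA-fp16",
    "results",
    split="latest",
)

# Inspect the first row; the exact column layout may vary between runs.
print(results[0])
```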
open-llm-leaderboard/details_TheBloke__llama-2-70b-Guanaco-QLoRA-fp16
[ "region:us" ]
2023-08-18T10:29:10+00:00
{"pretty_name": "Evaluation run of TheBloke/llama-2-70b-Guanaco-QLoRA-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/llama-2-70b-Guanaco-QLoRA-fp16](https://huggingface.co/TheBloke/llama-2-70b-Guanaco-QLoRA-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__llama-2-70b-Guanaco-QLoRA-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T03:53:16.698758](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__llama-2-70b-Guanaco-QLoRA-fp16/blob/main/results_2023-10-22T03-53-16.698758.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05620805369127517,\n \"em_stderr\": 0.0023587236332230886,\n \"f1\": 0.11980180369127513,\n \"f1_stderr\": 0.002592264922824749,\n \"acc\": 0.5688660001089055,\n \"acc_stderr\": 0.011453570865550992\n },\n \"harness|drop|3\": {\n \"em\": 0.05620805369127517,\n \"em_stderr\": 0.0023587236332230886,\n \"f1\": 0.11980180369127513,\n \"f1_stderr\": 0.002592264922824749\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2979529946929492,\n \"acc_stderr\": 0.012597932232914513\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187472\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/llama-2-70b-Guanaco-QLoRA-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T03_53_16.698758", "path": ["**/details_harness|drop|3_2023-10-22T03-53-16.698758.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T03-53-16.698758.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T03_53_16.698758", "path": ["**/details_harness|gsm8k|5_2023-10-22T03-53-16.698758.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T03-53-16.698758.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:54:57.592623.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:54:57.592623.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:54:57.592623.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:54:57.592623.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:54:57.592623.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:54:57.592623.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T03_53_16.698758", "path": ["**/details_harness|winogrande|5_2023-10-22T03-53-16.698758.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T03-53-16.698758.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T19_54_57.592623", "path": ["results_2023-07-25T19:54:57.592623.parquet"]}, {"split": "2023_10_22T03_53_16.698758", "path": ["results_2023-10-22T03-53-16.698758.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T03-53-16.698758.parquet"]}]}]}
2023-10-22T02:53:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/llama-2-70b-Guanaco-QLoRA-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/llama-2-70b-Guanaco-QLoRA-fp16 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T03:53:16.698758 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of TheBloke/llama-2-70b-Guanaco-QLoRA-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/llama-2-70b-Guanaco-QLoRA-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T03:53:16.698758(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/llama-2-70b-Guanaco-QLoRA-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/llama-2-70b-Guanaco-QLoRA-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T03:53:16.698758(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/llama-2-70b-Guanaco-QLoRA-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/llama-2-70b-Guanaco-QLoRA-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T03:53:16.698758(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
301b3644c7097054b7fa06a334029078f09088a8
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Uncensored-fp16

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-30B-Uncensored-fp16](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Uncensored-fp16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-19T13:45:18.299512](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Uncensored-fp16/blob/main/results_2023-10-19T13-45-18.299512.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.18162751677852348,
        "em_stderr": 0.0039482621737543045,
        "f1": 0.2674087667785243,
        "f1_stderr": 0.004012090110572664,
        "acc": 0.46353130406008236,
        "acc_stderr": 0.01059244186586655
    },
    "harness|drop|3": {
        "em": 0.18162751677852348,
        "em_stderr": 0.0039482621737543045,
        "f1": 0.2674087667785243,
        "f1_stderr": 0.004012090110572664
    },
    "harness|gsm8k|5": {
        "acc": 0.1425322213798332,
        "acc_stderr": 0.009629588445673819
    },
    "harness|winogrande|5": {
        "acc": 0.7845303867403315,
        "acc_stderr": 0.011555295286059279
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
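Each per-task configuration in this dataset keeps one split per run timestamp plus a "latest" alias. A minimal sketch of listing those splits and reading the most recent Winogrande details, assuming only the `datasets` library and the split naming described in this card:

```python
from datasets import load_dataset

# Without an explicit split, load_dataset returns a DatasetDict with one
# timestamped split per run plus the "latest" alias.
winogrande = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Uncensored-fp16",
    "harness_winogrande_5",
)

print(list(winogrande.keys()))   # e.g. ['2023_10_19T13_45_18.299512', 'latest']
print(winogrande["latest"][0])   # first detail record of the most recent run
```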
open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Uncensored-fp16
[ "region:us" ]
2023-08-18T10:29:19+00:00
{"pretty_name": "Evaluation run of TheBloke/Wizard-Vicuna-30B-Uncensored-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-30B-Uncensored-fp16](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Uncensored-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T13:45:18.299512](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Uncensored-fp16/blob/main/results_2023-10-19T13-45-18.299512.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.18162751677852348,\n \"em_stderr\": 0.0039482621737543045,\n \"f1\": 0.2674087667785243,\n \"f1_stderr\": 0.004012090110572664,\n \"acc\": 0.46353130406008236,\n \"acc_stderr\": 0.01059244186586655\n },\n \"harness|drop|3\": {\n \"em\": 0.18162751677852348,\n \"em_stderr\": 0.0039482621737543045,\n \"f1\": 0.2674087667785243,\n \"f1_stderr\": 0.004012090110572664\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1425322213798332,\n \"acc_stderr\": 0.009629588445673819\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059279\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T13_45_18.299512", "path": ["**/details_harness|drop|3_2023-10-19T13-45-18.299512.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T13-45-18.299512.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T13_45_18.299512", "path": ["**/details_harness|gsm8k|5_2023-10-19T13-45-18.299512.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T13-45-18.299512.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:48:26.116631.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:48:26.116631.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:48:26.116631.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:48:26.116631.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:48:26.116631.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:48:26.116631.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T13_45_18.299512", "path": ["**/details_harness|winogrande|5_2023-10-19T13-45-18.299512.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T13-45-18.299512.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_48_26.116631", "path": ["results_2023-07-19T22:48:26.116631.parquet"]}, {"split": "2023_10_19T13_45_18.299512", "path": ["results_2023-10-19T13-45-18.299512.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T13-45-18.299512.parquet"]}]}]}
2023-10-19T12:45:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Uncensored-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Uncensored-fp16 on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-19T13:45:18.299512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Uncensored-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Uncensored-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T13:45:18.299512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Uncensored-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Uncensored-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T13:45:18.299512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Uncensored-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Uncensored-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T13:45:18.299512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6978a8cf5116988f70372510b6f562f3787c15eb
# Dataset Card for Evaluation run of TheBloke/vicuna-13B-1.1-HF

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/vicuna-13B-1.1-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/vicuna-13B-1.1-HF](https://huggingface.co/TheBloke/vicuna-13B-1.1-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__vicuna-13B-1.1-HF",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T02:01:12.621227](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__vicuna-13B-1.1-HF/blob/main/results_2023-10-23T02-01-12.621227.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.029677013422818792,
        "em_stderr": 0.0017378324714143493,
        "f1": 0.09310612416107406,
        "f1_stderr": 0.002167792401176146,
        "acc": 0.4141695683211732,
        "acc_stderr": 0.010019161585538096
    },
    "harness|drop|3": {
        "em": 0.029677013422818792,
        "em_stderr": 0.0017378324714143493,
        "f1": 0.09310612416107406,
        "f1_stderr": 0.002167792401176146
    },
    "harness|gsm8k|5": {
        "acc": 0.08642911296436695,
        "acc_stderr": 0.00774004433710381
    },
    "harness|winogrande|5": {
        "acc": 0.7419100236779794,
        "acc_stderr": 0.012298278833972384
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
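A minimal sketch of loading the aggregated "results" configuration described in the card above, assuming it exposes the same "latest" split naming as the task configurations listed for the neighbouring records in this dump:

```python
from datasets import load_dataset

# Aggregated metrics for every evaluation run of TheBloke/vicuna-13B-1.1-HF.
# Assumption: the "results" configuration provides a "latest" split pointing
# at the most recent results file, as it does for the other repos shown here.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__vicuna-13B-1.1-HF",
    "results",
    split="latest",
)
print(results[0])
```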
open-llm-leaderboard/details_TheBloke__vicuna-13B-1.1-HF
[ "region:us" ]
2023-08-18T10:29:27+00:00
{"pretty_name": "Evaluation run of TheBloke/vicuna-13B-1.1-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/vicuna-13B-1.1-HF](https://huggingface.co/TheBloke/vicuna-13B-1.1-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__vicuna-13B-1.1-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:01:12.621227](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__vicuna-13B-1.1-HF/blob/main/results_2023-10-23T02-01-12.621227.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146,\n \"acc\": 0.4141695683211732,\n \"acc_stderr\": 0.010019161585538096\n },\n \"harness|drop|3\": {\n \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.00774004433710381\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/vicuna-13B-1.1-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|arc:challenge|25_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T02_01_12.621227", "path": ["**/details_harness|drop|3_2023-10-23T02-01-12.621227.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-01-12.621227.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T02_01_12.621227", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-01-12.621227.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-01-12.621227.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hellaswag|10_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:57:49.812019.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:57:49.812019.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T13:57:49.812019.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T13:57:49.812019.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T13:57:49.812019.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T02_01_12.621227", "path": ["**/details_harness|winogrande|5_2023-10-23T02-01-12.621227.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-01-12.621227.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T13_57_49.812019", "path": ["results_2023-07-18T13:57:49.812019.parquet"]}, {"split": "2023_10_23T02_01_12.621227", "path": ["results_2023-10-23T02-01-12.621227.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-01-12.621227.parquet"]}]}]}
2023-10-23T01:01:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/vicuna-13B-1.1-HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/vicuna-13B-1.1-HF on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T02:01:12.621227 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
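The plain-text summary above refers to a loading snippet that is not reproduced in this field. Below is a minimal sketch; the repository id is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming (it is not spelled out here), while the config and split names are taken from the config metadata listed above.

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard "details_<org>__<model>" pattern.
repo_id = "open-llm-leaderboard/details_TheBloke__vicuna-13B-1.1-HF"

# Per-example details of the 5-shot Winogrande run; "latest" is one of the splits
# declared for this config in the metadata above.
winogrande = load_dataset(repo_id, "harness_winogrande_5", split="latest")

# Aggregated metrics for the most recent run, via the "results" configuration.
results = load_dataset(repo_id, "results", split="latest")
print(results[0])
```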
[ "# Dataset Card for Evaluation run of TheBloke/vicuna-13B-1.1-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/vicuna-13B-1.1-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T02:01:12.621227(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/vicuna-13B-1.1-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/vicuna-13B-1.1-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T02:01:12.621227(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/vicuna-13B-1.1-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/vicuna-13B-1.1-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:01:12.621227(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b7fe48db6389070c4239cc4ed0fa7ce5703efbf8
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-7B-Uncensored-HF

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TheBloke/Wizard-Vicuna-7B-Uncensored-HF
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-7B-Uncensored-HF](https://huggingface.co/TheBloke/Wizard-Vicuna-7B-Uncensored-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-7B-Uncensored-HF",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T23:25:47.452800](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-7B-Uncensored-HF/blob/main/results_2023-10-22T23-25-47.452800.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.18036912751677853,
        "em_stderr": 0.003937584689736024,
        "f1": 0.23801803691275183,
        "f1_stderr": 0.003988701736112215,
        "acc": 0.3838336904677134,
        "acc_stderr": 0.009164287920296908
    },
    "harness|drop|3": {
        "em": 0.18036912751677853,
        "em_stderr": 0.003937584689736024,
        "f1": 0.23801803691275183,
        "f1_stderr": 0.003988701736112215
    },
    "harness|gsm8k|5": {
        "acc": 0.045489006823351025,
        "acc_stderr": 0.005739657656722215
    },
    "harness|winogrande|5": {
        "acc": 0.7221783741120757,
        "acc_stderr": 0.012588918183871601
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
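Beyond the per-task example in the card, the aggregated metrics can also be pulled from the "results" configuration. The following is a small illustrative sketch, not part of the original card; the config names and the "latest" split are taken from the dataset metadata recorded below for this repository, and everything else is standard `datasets` usage.

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-7B-Uncensored-HF"

# Aggregated metrics of the most recent evaluation run.
results_latest = load_dataset(repo_id, "results", split="latest")

# Per-example details for the newer DROP and GSM8K runs
# (config names listed in the metadata below).
drop_details = load_dataset(repo_id, "harness_drop_3", split="latest")
gsm8k_details = load_dataset(repo_id, "harness_gsm8k_5", split="latest")

print(results_latest[0])
print(len(drop_details), len(gsm8k_details))
```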
open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-7B-Uncensored-HF
[ "region:us" ]
2023-08-18T10:29:36+00:00
{"pretty_name": "Evaluation run of TheBloke/Wizard-Vicuna-7B-Uncensored-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-7B-Uncensored-HF](https://huggingface.co/TheBloke/Wizard-Vicuna-7B-Uncensored-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-7B-Uncensored-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T23:25:47.452800](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-7B-Uncensored-HF/blob/main/results_2023-10-22T23-25-47.452800.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.18036912751677853,\n \"em_stderr\": 0.003937584689736024,\n \"f1\": 0.23801803691275183,\n \"f1_stderr\": 0.003988701736112215,\n \"acc\": 0.3838336904677134,\n \"acc_stderr\": 0.009164287920296908\n },\n \"harness|drop|3\": {\n \"em\": 0.18036912751677853,\n \"em_stderr\": 0.003937584689736024,\n \"f1\": 0.23801803691275183,\n \"f1_stderr\": 0.003988701736112215\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \"acc_stderr\": 0.005739657656722215\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871601\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Wizard-Vicuna-7B-Uncensored-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T23_25_47.452800", "path": ["**/details_harness|drop|3_2023-10-22T23-25-47.452800.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T23-25-47.452800.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T23_25_47.452800", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-25-47.452800.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-25-47.452800.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:11:01.220046.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:11:01.220046.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:11:01.220046.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:11:01.220046.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:11:01.220046.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:11:01.220046.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T23_25_47.452800", "path": ["**/details_harness|winogrande|5_2023-10-22T23-25-47.452800.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T23-25-47.452800.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_11_01.220046", "path": ["results_2023-07-19T17:11:01.220046.parquet"]}, {"split": "2023_10_22T23_25_47.452800", "path": ["results_2023-10-22T23-25-47.452800.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T23-25-47.452800.parquet"]}]}]}
2023-10-22T22:25:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-7B-Uncensored-HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-7B-Uncensored-HF on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T23:25:47.452800 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
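The card text above says "To load the details from a run, you can for instance do the following:" but the code snippet itself was stripped from this processed copy. A minimal sketch is given below; the repository id is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming pattern, and `harness_winogrande_5` is one of the configurations listed in this row's config metadata.

```python
from datasets import load_dataset

# Hypothetical repo id, assumed from the "open-llm-leaderboard/details_<org>__<model>" pattern.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-7B-Uncensored-HF",
    "harness_winogrande_5",  # a config listed in this row's metadata
    split="train",
)
```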
[ "# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-7B-Uncensored-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-7B-Uncensored-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T23:25:47.452800(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-7B-Uncensored-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-7B-Uncensored-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T23:25:47.452800(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-7B-Uncensored-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-7B-Uncensored-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T23:25:47.452800(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e0101d17f6344594bdec1f9b347bf592b6b19a27
# Dataset Card for Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-07-31T19:04:33.192118](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A04%3A33.192118.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2367148405069541, "acc_stderr": 0.030958077810881182, "acc_norm": 0.23838963087978138, "acc_norm_stderr": 0.030974710079953026, "mc1": 0.23378212974296206, "mc1_stderr": 0.01481619599193159, "mc2": 0.4693099566156165, "mc2_stderr": 0.01667201792733067 }, "harness|arc:challenge|25": { "acc": 0.21331058020477817, "acc_stderr": 0.011970971742326334, "acc_norm": 0.25426621160409557, "acc_norm_stderr": 0.012724999945157744 }, "harness|hellaswag|10": { "acc": 0.28828918542123083, "acc_stderr": 0.00452040633108404, "acc_norm": 0.3461461860187214, "acc_norm_stderr": 0.004747682003491466 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.24444444444444444, "acc_stderr": 0.03712537833614865, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.03712537833614865 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.025288394502891373, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.025288394502891373 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.25, "acc_stderr": 0.03621034121889507, "acc_norm": 0.25, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2543859649122807, "acc_stderr": 0.04096985139843671, "acc_norm": 0.2543859649122807, "acc_norm_stderr": 0.04096985139843671 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2275132275132275, "acc_stderr": 0.02159126940782378, "acc_norm": 0.2275132275132275, "acc_norm_stderr": 0.02159126940782378 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.20634920634920634, "acc_stderr": 0.0361960452412425, "acc_norm": 
0.20634920634920634, "acc_norm_stderr": 0.0361960452412425 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2838709677419355, "acc_stderr": 0.025649381063029254, "acc_norm": 0.2838709677419355, "acc_norm_stderr": 0.025649381063029254 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.24630541871921183, "acc_stderr": 0.030315099285617722, "acc_norm": 0.24630541871921183, "acc_norm_stderr": 0.030315099285617722 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2727272727272727, "acc_stderr": 0.0347769116216366, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.18181818181818182, "acc_stderr": 0.027479603010538797, "acc_norm": 0.18181818181818182, "acc_norm_stderr": 0.027479603010538797 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860702, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860702 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20512820512820512, "acc_stderr": 0.020473233173551982, "acc_norm": 0.20512820512820512, "acc_norm_stderr": 0.020473233173551982 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23703703703703705, "acc_stderr": 0.02592887613276612, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.02592887613276612 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.027553614467863818, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.027553614467863818 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.19205298013245034, "acc_stderr": 0.032162984205936135, "acc_norm": 0.19205298013245034, "acc_norm_stderr": 0.032162984205936135 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21467889908256882, "acc_stderr": 0.01760430414925649, "acc_norm": 0.21467889908256882, "acc_norm_stderr": 0.01760430414925649 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03005820270430985, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03005820270430985 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.242152466367713, "acc_stderr": 0.028751392398694755, "acc_norm": 0.242152466367713, "acc_norm_stderr": 0.028751392398694755 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070416, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.26851851851851855, "acc_stderr": 0.04284467968052192, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 
0.04284467968052192 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22699386503067484, "acc_stderr": 0.03291099578615767, "acc_norm": 0.22699386503067484, "acc_norm_stderr": 0.03291099578615767 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841043, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841043 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.23931623931623933, "acc_stderr": 0.027951826808924333, "acc_norm": 0.23931623931623933, "acc_norm_stderr": 0.027951826808924333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2554278416347382, "acc_stderr": 0.015594955384455772, "acc_norm": 0.2554278416347382, "acc_norm_stderr": 0.015594955384455772 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.20520231213872833, "acc_stderr": 0.021742519835276287, "acc_norm": 0.20520231213872833, "acc_norm_stderr": 0.021742519835276287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23910614525139665, "acc_stderr": 0.014265554192331144, "acc_norm": 0.23910614525139665, "acc_norm_stderr": 0.014265554192331144 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.21568627450980393, "acc_stderr": 0.02355083135199509, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.02355083135199509 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2191358024691358, "acc_stderr": 0.023016705640262203, "acc_norm": 0.2191358024691358, "acc_norm_stderr": 0.023016705640262203 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24822695035460993, "acc_stderr": 0.025770015644290392, "acc_norm": 0.24822695035460993, "acc_norm_stderr": 0.025770015644290392 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24771838331160365, "acc_stderr": 0.011025499291443738, "acc_norm": 0.24771838331160365, "acc_norm_stderr": 0.011025499291443738 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.21323529411764705, "acc_stderr": 0.024880971512294275, "acc_norm": 0.21323529411764705, "acc_norm_stderr": 0.024880971512294275 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2549019607843137, "acc_stderr": 0.017630827375148383, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.017630827375148383 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2, "acc_stderr": 0.03831305140884601, "acc_norm": 0.2, "acc_norm_stderr": 0.03831305140884601 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.19183673469387755, "acc_stderr": 0.025206963154225378, "acc_norm": 0.19183673469387755, "acc_norm_stderr": 0.025206963154225378 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.23, "acc_stderr": 0.04229525846816508, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816508 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, 
"acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2222222222222222, "acc_stderr": 0.03188578017686399, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.03188578017686399 }, "harness|truthfulqa:mc|0": { "mc1": 0.23378212974296206, "mc1_stderr": 0.01481619599193159, "mc2": 0.4693099566156165, "mc2_stderr": 0.01667201792733067 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16
[ "region:us" ]
2023-08-18T10:29:45+00:00
{"pretty_name": "Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-31T19:04:33.192118](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A04%3A33.192118.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2367148405069541,\n \"acc_stderr\": 0.030958077810881182,\n \"acc_norm\": 0.23838963087978138,\n \"acc_norm_stderr\": 0.030974710079953026,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193159,\n \"mc2\": 0.4693099566156165,\n \"mc2_stderr\": 0.01667201792733067\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n \"acc_norm\": 0.25426621160409557,\n \"acc_norm_stderr\": 0.012724999945157744\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28828918542123083,\n \"acc_stderr\": 0.00452040633108404,\n \"acc_norm\": 0.3461461860187214,\n \"acc_norm_stderr\": 0.004747682003491466\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.03712537833614865,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.03712537833614865\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891373,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891373\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n 
\"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2275132275132275,\n \"acc_stderr\": 0.02159126940782378,\n \"acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.02159126940782378\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n \"acc_stderr\": 0.025649381063029254,\n \"acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.025649381063029254\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617722,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617722\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860702,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860702\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.020473233173551982,\n \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.020473233173551982\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863818,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863818\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.19205298013245034,\n \"acc_stderr\": 0.032162984205936135,\n \"acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.032162984205936135\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21467889908256882,\n \"acc_stderr\": 0.01760430414925649,\n \"acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.01760430414925649\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03005820270430985,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03005820270430985\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615767,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615767\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2554278416347382,\n \"acc_stderr\": 0.015594955384455772,\n \"acc_norm\": 0.2554278416347382,\n \"acc_norm_stderr\": 0.015594955384455772\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.20520231213872833,\n \"acc_stderr\": 0.021742519835276287,\n \"acc_norm\": 0.20520231213872833,\n \"acc_norm_stderr\": 0.021742519835276287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262203,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262203\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n \"acc_stderr\": 0.011025499291443738,\n \"acc_norm\": 0.24771838331160365,\n \"acc_norm_stderr\": 0.011025499291443738\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294275,\n \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294275\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225378,\n \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193159,\n \"mc2\": 0.4693099566156165,\n \"mc2_stderr\": 0.01667201792733067\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|arc:challenge|25_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hellaswag|10_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T19:04:33.192118.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T19_04_33.192118", "path": ["results_2023-07-31T19:04:33.192118.parquet"]}, {"split": "latest", "path": ["results_2023-07-31T19:04:33.192118.parquet"]}]}]}
2023-08-27T11:34:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-07-31T19:04:33.192118 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
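A minimal sketch of the load call referenced above, assuming the details repository and config name follow the same naming pattern as the other evaluation cards in this collection (both names are illustrative, not confirmed by this card):

```python
from datasets import load_dataset

# NOTE: repository and config names are assumptions, inferred from the naming
# pattern used by the other evaluation-details cards in this collection.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16",
    "harness_truthfulqa_mc_0",
    split="train",
)
```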
[ "# Dataset Card for Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-31T19:04:33.192118 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-31T19:04:33.192118 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-31T19:04:33.192118 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
12e8790488a75dc2292cede7850ded3c9168465e
# Dataset Card for Evaluation run of TheBloke/GPlatty-30B-SuperHOT-8K-fp16 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TheBloke/GPlatty-30B-SuperHOT-8K-fp16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [TheBloke/GPlatty-30B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/GPlatty-30B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TheBloke__GPlatty-30B-SuperHOT-8K-fp16", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-01T15:51:23.628970](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__GPlatty-30B-SuperHOT-8K-fp16/blob/main/results_2023-08-01T15%3A51%3A23.628970.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.24941704039386783, "acc_stderr": 0.0314384194357432, "acc_norm": 0.2512238671780757, "acc_norm_stderr": 0.03145763914734606, "mc1": 0.22888616891064872, "mc1_stderr": 0.014706994909055027, "mc2": 0.46272712607124966, "mc2_stderr": 0.016702158477967525 }, "harness|arc:challenge|25": { "acc": 0.22696245733788395, "acc_stderr": 0.012240491536132868, "acc_norm": 0.2832764505119454, "acc_norm_stderr": 0.013167478735134576 }, "harness|hellaswag|10": { "acc": 0.28450507866958774, "acc_stderr": 0.004502563079349398, "acc_norm": 0.33479386576379205, "acc_norm_stderr": 0.0047095388649163105 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2518518518518518, "acc_stderr": 0.037498507091740206, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.037498507091740206 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.19736842105263158, "acc_stderr": 0.03238981601699397, "acc_norm": 0.19736842105263158, "acc_norm_stderr": 0.03238981601699397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.22641509433962265, "acc_stderr": 0.02575755989310675, "acc_norm": 0.22641509433962265, "acc_norm_stderr": 0.02575755989310675 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2361111111111111, "acc_stderr": 0.03551446610810826, "acc_norm": 0.2361111111111111, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.24, 
"acc_stderr": 0.04292346959909282, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.19, "acc_stderr": 0.039427724440366234, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.04389869956808777, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.04389869956808777 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2851063829787234, "acc_stderr": 0.029513196625539355, "acc_norm": 0.2851063829787234, "acc_norm_stderr": 0.029513196625539355 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21052631578947367, "acc_stderr": 0.03835153954399421, "acc_norm": 0.21052631578947367, "acc_norm_stderr": 0.03835153954399421 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.27586206896551724, "acc_stderr": 0.037245636197746325, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.037245636197746325 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.21428571428571427, "acc_stderr": 0.02113285918275444, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.02113285918275444 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2645161290322581, "acc_stderr": 0.025091892378859275, "acc_norm": 0.2645161290322581, "acc_norm_stderr": 0.025091892378859275 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.22167487684729065, "acc_stderr": 0.029225575892489624, "acc_norm": 0.22167487684729065, "acc_norm_stderr": 0.029225575892489624 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.29, "acc_stderr": 0.04560480215720683, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.21717171717171718, "acc_stderr": 0.029376616484945633, "acc_norm": 0.21717171717171718, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.18134715025906736, "acc_stderr": 0.02780703236068609, "acc_norm": 0.18134715025906736, "acc_norm_stderr": 0.02780703236068609 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2717948717948718, "acc_stderr": 0.022556551010132354, "acc_norm": 0.2717948717948718, "acc_norm_stderr": 0.022556551010132354 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2037037037037037, "acc_stderr": 0.024556172219141265, "acc_norm": 0.2037037037037037, "acc_norm_stderr": 0.024556172219141265 }, 
"harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.22268907563025211, "acc_stderr": 0.027025433498882385, "acc_norm": 0.22268907563025211, "acc_norm_stderr": 0.027025433498882385 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.19205298013245034, "acc_stderr": 0.032162984205936135, "acc_norm": 0.19205298013245034, "acc_norm_stderr": 0.032162984205936135 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.22568807339449543, "acc_stderr": 0.017923087667803053, "acc_norm": 0.22568807339449543, "acc_norm_stderr": 0.017923087667803053 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.26851851851851855, "acc_stderr": 0.030225226160012397, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.030225226160012397 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.03019028245350195, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2869198312236287, "acc_stderr": 0.029443773022594693, "acc_norm": 0.2869198312236287, "acc_norm_stderr": 0.029443773022594693 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.2914798206278027, "acc_stderr": 0.030500283176545902, "acc_norm": 0.2914798206278027, "acc_norm_stderr": 0.030500283176545902 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2748091603053435, "acc_stderr": 0.03915345408847835, "acc_norm": 0.2748091603053435, "acc_norm_stderr": 0.03915345408847835 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2727272727272727, "acc_stderr": 0.04065578140908705, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.26851851851851855, "acc_stderr": 0.04284467968052191, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.04284467968052191 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26993865030674846, "acc_stderr": 0.03487825168497892, "acc_norm": 0.26993865030674846, "acc_norm_stderr": 0.03487825168497892 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2863247863247863, "acc_stderr": 0.029614323690456648, "acc_norm": 0.2863247863247863, "acc_norm_stderr": 0.029614323690456648 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.3065134099616858, "acc_stderr": 0.01648695289304151, "acc_norm": 0.3065134099616858, "acc_norm_stderr": 0.01648695289304151 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.23121387283236994, "acc_stderr": 0.022698657167855716, "acc_norm": 0.23121387283236994, "acc_norm_stderr": 0.022698657167855716 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24581005586592178, "acc_stderr": 0.014400296429225629, "acc_norm": 0.24581005586592178, "acc_norm_stderr": 0.014400296429225629 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3104575163398693, "acc_stderr": 0.026493033225145894, "acc_norm": 0.3104575163398693, "acc_norm_stderr": 0.026493033225145894 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.27009646302250806, "acc_stderr": 
0.025218040373410612, "acc_norm": 0.27009646302250806, "acc_norm_stderr": 0.025218040373410612 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2345679012345679, "acc_stderr": 0.023576881744005716, "acc_norm": 0.2345679012345679, "acc_norm_stderr": 0.023576881744005716 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.25177304964539005, "acc_stderr": 0.025892151156709405, "acc_norm": 0.25177304964539005, "acc_norm_stderr": 0.025892151156709405 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.26597131681877445, "acc_stderr": 0.011285033165551274, "acc_norm": 0.26597131681877445, "acc_norm_stderr": 0.011285033165551274 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.17647058823529413, "acc_stderr": 0.02315746830855934, "acc_norm": 0.17647058823529413, "acc_norm_stderr": 0.02315746830855934 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.27450980392156865, "acc_stderr": 0.018054027458815198, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.018054027458815198 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2, "acc_stderr": 0.03831305140884601, "acc_norm": 0.2, "acc_norm_stderr": 0.03831305140884601 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.20816326530612245, "acc_stderr": 0.025991117672813292, "acc_norm": 0.20816326530612245, "acc_norm_stderr": 0.025991117672813292 }, "harness|hendrycksTest-sociology|5": { "acc": 0.26865671641791045, "acc_stderr": 0.03134328358208954, "acc_norm": 0.26865671641791045, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-virology|5": { "acc": 0.2710843373493976, "acc_stderr": 0.03460579907553027, "acc_norm": 0.2710843373493976, "acc_norm_stderr": 0.03460579907553027 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.29239766081871343, "acc_stderr": 0.034886477134579215, "acc_norm": 0.29239766081871343, "acc_norm_stderr": 0.034886477134579215 }, "harness|truthfulqa:mc|0": { "mc1": 0.22888616891064872, "mc1_stderr": 0.014706994909055027, "mc2": 0.46272712607124966, "mc2_stderr": 0.016702158477967525 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
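In the same way, the aggregated scores can be pulled from the "results" configuration, whose "latest" split always points to the newest run; a minimal sketch, assuming this card defines the same "results" config and "latest" split as the other runs in this collection:

```python
from datasets import load_dataset

# Assumed config/split names: the "results" config stores the aggregated
# metrics for a run, and the "latest" split tracks the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__GPlatty-30B-SuperHOT-8K-fp16",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / mc1 / mc2 values for the run
```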
open-llm-leaderboard/details_TheBloke__GPlatty-30B-SuperHOT-8K-fp16
[ "region:us" ]
2023-08-18T10:29:53+00:00
{"pretty_name": "Evaluation run of TheBloke/GPlatty-30B-SuperHOT-8K-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/GPlatty-30B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/GPlatty-30B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__GPlatty-30B-SuperHOT-8K-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-01T15:51:23.628970](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__GPlatty-30B-SuperHOT-8K-fp16/blob/main/results_2023-08-01T15%3A51%3A23.628970.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24941704039386783,\n \"acc_stderr\": 0.0314384194357432,\n \"acc_norm\": 0.2512238671780757,\n \"acc_norm_stderr\": 0.03145763914734606,\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.46272712607124966,\n \"mc2_stderr\": 0.016702158477967525\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132868,\n \"acc_norm\": 0.2832764505119454,\n \"acc_norm_stderr\": 0.013167478735134576\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28450507866958774,\n \"acc_stderr\": 0.004502563079349398,\n \"acc_norm\": 0.33479386576379205,\n \"acc_norm_stderr\": 0.0047095388649163105\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.02575755989310675,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.02575755989310675\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03835153954399421,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03835153954399421\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02113285918275444,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02113285918275444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n \"acc_stderr\": 0.025091892378859275,\n \"acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.025091892378859275\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489624,\n \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489624\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.18134715025906736,\n \"acc_stderr\": 0.02780703236068609,\n \"acc_norm\": 0.18134715025906736,\n \"acc_norm_stderr\": 0.02780703236068609\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.022556551010132354,\n \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.024556172219141265,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.024556172219141265\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882385,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882385\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.19205298013245034,\n \"acc_stderr\": 0.032162984205936135,\n \"acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.032162984205936135\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22568807339449543,\n \"acc_stderr\": 0.017923087667803053,\n \"acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.017923087667803053\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012397,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012397\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.029443773022594693,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.029443773022594693\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n \"acc_stderr\": 0.030500283176545902,\n \"acc_norm\": 0.2914798206278027,\n \"acc_norm_stderr\": 0.030500283176545902\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847835,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847835\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.3065134099616858,\n \"acc_stderr\": 0.01648695289304151,\n \"acc_norm\": 0.3065134099616858,\n \"acc_norm_stderr\": 0.01648695289304151\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855716,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225629,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225629\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3104575163398693,\n \"acc_stderr\": 0.026493033225145894,\n \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.026493033225145894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n \"acc_stderr\": 0.025218040373410612,\n \"acc_norm\": 0.27009646302250806,\n \"acc_norm_stderr\": 0.025218040373410612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n \"acc_stderr\": 0.011285033165551274,\n \"acc_norm\": 0.26597131681877445,\n \"acc_norm_stderr\": 0.011285033165551274\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.02315746830855934,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.02315746830855934\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815198,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815198\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813292,\n \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813292\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.46272712607124966,\n \"mc2_stderr\": 0.016702158477967525\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/GPlatty-30B-SuperHOT-8K-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|arc:challenge|25_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hellaswag|10_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:51:23.628970.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:51:23.628970.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:51:23.628970.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T15:51:23.628970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T15:51:23.628970.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T15_51_23.628970", "path": ["results_2023-08-01T15:51:23.628970.parquet"]}, {"split": "latest", "path": ["results_2023-08-01T15:51:23.628970.parquet"]}]}]}
2023-08-27T11:34:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/GPlatty-30B-SuperHOT-8K-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/GPlatty-30B-SuperHOT-8K-fp16 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch below): ## Latest results These are the latest results from run 2023-08-01T15:51:23.628970 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
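The flattened card text above drops the code block that originally followed "you can for instance do the following:"; the sketch below is a hedged reconstruction. The repository id is an assumption based on the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other records in this dump (it is not stated verbatim in this record), while the config name and the "latest" split are taken from this record's metadata.

```python
from datasets import load_dataset

# ASSUMPTION: the repository id below follows the usual open-llm-leaderboard
# naming convention; it does not appear verbatim in this record.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__GPlatty-30B-SuperHOT-8K-fp16",
    "harness_truthfulqa_mc_0",  # config name listed in this record's metadata
    split="latest",             # split name listed in this record's metadata
)
print(data)
```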
[ "# Dataset Card for Evaluation run of TheBloke/GPlatty-30B-SuperHOT-8K-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/GPlatty-30B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-01T15:51:23.628970 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/GPlatty-30B-SuperHOT-8K-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/GPlatty-30B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-01T15:51:23.628970 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/GPlatty-30B-SuperHOT-8K-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/GPlatty-30B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-01T15:51:23.628970 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5de7aa57d4cbb90d31a02b6ab6e81013f1157ca3
# Dataset Card for Evaluation run of anton-l/gpt-j-tiny-random ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/anton-l/gpt-j-tiny-random - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [anton-l/gpt-j-tiny-random](https://huggingface.co/anton-l/gpt-j-tiny-random) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_anton-l__gpt-j-tiny-random", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T06:54:36.859964](https://huggingface.co/datasets/open-llm-leaderboard/details_anton-l__gpt-j-tiny-random/blob/main/results_2023-10-28T06-54-36.859964.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0, "em_stderr": 0.0, "f1": 9.857382550335573e-05, "f1_stderr": 2.430375363900546e-05, "acc": 0.2474348855564325, "acc_stderr": 0.007025872980895258 }, "harness|drop|3": { "em": 0.0, "em_stderr": 0.0, "f1": 9.857382550335573e-05, "f1_stderr": 2.430375363900546e-05 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.494869771112865, "acc_stderr": 0.014051745961790516 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
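The snippet in the card loads the per-example details for one task into the "train" split. As a complementary sketch, the aggregated scores can be read from the "results" config and the "latest" split, both of which are declared in this record's metadata; the exact column layout of the parquet files is not shown in this record, so the printout below is only illustrative.

```python
from datasets import load_dataset

# Aggregated scores for the most recent run (the "results" config and the
# "latest" split are both declared in this record's metadata).
results = load_dataset(
    "open-llm-leaderboard/details_anton-l__gpt-j-tiny-random",
    "results",
    split="latest",
)

# Per-example details for a single task, here the 5-shot Winogrande config
# shown in the card above, also pinned to the latest run.
winogrande = load_dataset(
    "open-llm-leaderboard/details_anton-l__gpt-j-tiny-random",
    "harness_winogrande_5",
    split="latest",
)

print(results)
print(winogrande)
```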
open-llm-leaderboard/details_anton-l__gpt-j-tiny-random
[ "region:us" ]
2023-08-18T10:30:03+00:00
{"pretty_name": "Evaluation run of anton-l/gpt-j-tiny-random", "dataset_summary": "Dataset automatically created during the evaluation run of model [anton-l/gpt-j-tiny-random](https://huggingface.co/anton-l/gpt-j-tiny-random) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_anton-l__gpt-j-tiny-random\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T06:54:36.859964](https://huggingface.co/datasets/open-llm-leaderboard/details_anton-l__gpt-j-tiny-random/blob/main/results_2023-10-28T06-54-36.859964.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 9.857382550335573e-05,\n \"f1_stderr\": 2.430375363900546e-05,\n \"acc\": 0.2474348855564325,\n \"acc_stderr\": 0.007025872980895258\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 9.857382550335573e-05,\n \"f1_stderr\": 2.430375363900546e-05\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.494869771112865,\n \"acc_stderr\": 0.014051745961790516\n }\n}\n```", "repo_url": "https://huggingface.co/anton-l/gpt-j-tiny-random", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T06_54_36.859964", "path": ["**/details_harness|drop|3_2023-10-28T06-54-36.859964.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T06-54-36.859964.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T06_54_36.859964", "path": ["**/details_harness|gsm8k|5_2023-10-28T06-54-36.859964.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T06-54-36.859964.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:12:24.842449.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:12:24.842449.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:12:24.842449.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:12:24.842449.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:12:24.842449.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:12:24.842449.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T06_54_36.859964", "path": ["**/details_harness|winogrande|5_2023-10-28T06-54-36.859964.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T06-54-36.859964.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T16_12_24.842449", "path": ["results_2023-07-18T16:12:24.842449.parquet"]}, {"split": "2023_10_28T06_54_36.859964", "path": ["results_2023-10-28T06-54-36.859964.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T06-54-36.859964.parquet"]}]}]}
2023-10-28T05:54:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of anton-l/gpt-j-tiny-random ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model anton-l/gpt-j-tiny-random on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet after this card text): ## Latest results These are the latest results from run 2023-10-28T06:54:36.859964 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
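The plain-text rendering above drops the code block that originally followed "do the following". As a minimal sketch (not part of the original card text), and assuming the repository follows the standard leaderboard naming pattern `open-llm-leaderboard/details_anton-l__gpt-j-tiny-random` and exposes the `harness_winogrande_5` config listed in the metadata above, loading the details of a run would look like this:

```python
from datasets import load_dataset

# Assumed repository id, following the usual open-llm-leaderboard naming pattern;
# "harness_winogrande_5" is one of the configs listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_anton-l__gpt-j-tiny-random",
    "harness_winogrande_5",
    split="train",  # "train"/"latest" point at the most recent run; dated splits are also available
)
print(data)
```

The aggregated metrics can be read the same way by passing the `"results"` config name instead of a per-task config.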
[ "# Dataset Card for Evaluation run of anton-l/gpt-j-tiny-random", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model anton-l/gpt-j-tiny-random on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T06:54:36.859964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of anton-l/gpt-j-tiny-random", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model anton-l/gpt-j-tiny-random on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T06:54:36.859964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of anton-l/gpt-j-tiny-random## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model anton-l/gpt-j-tiny-random on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T06:54:36.859964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a33b58ce730e4e82caed22de9c4a69dc0dcf2a39
# Dataset Card for Evaluation run of SLAM-group/NewHope ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/SLAM-group/NewHope - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [SLAM-group/NewHope](https://huggingface.co/SLAM-group/NewHope) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SLAM-group__NewHope", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-02T16:20:26.294433](https://huggingface.co/datasets/open-llm-leaderboard/details_SLAM-group__NewHope/blob/main/results_2023-08-02T16%3A20%3A26.294433.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5588691829632426, "acc_stderr": 0.03433115773924322, "acc_norm": 0.5628652703397449, "acc_norm_stderr": 0.03430877590228174, "mc1": 0.3243574051407589, "mc1_stderr": 0.016387976779647935, "mc2": 0.44868368066946906, "mc2_stderr": 0.015140951474620613 }, "harness|arc:challenge|25": { "acc": 0.5767918088737202, "acc_stderr": 0.014438036220848022, "acc_norm": 0.6092150170648464, "acc_norm_stderr": 0.014258563880513782 }, "harness|hellaswag|10": { "acc": 0.6366261700856403, "acc_stderr": 0.004799882248494812, "acc_norm": 0.8399721171081458, "acc_norm_stderr": 0.003658826208101608 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5328947368421053, "acc_stderr": 0.040601270352363966, "acc_norm": 0.5328947368421053, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5811320754716981, "acc_stderr": 0.03036505082911521, "acc_norm": 0.5811320754716981, "acc_norm_stderr": 0.03036505082911521 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5763888888888888, "acc_stderr": 0.04132125019723369, "acc_norm": 0.5763888888888888, "acc_norm_stderr": 0.04132125019723369 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.047609522856952344, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952344 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5260115606936416, "acc_stderr": 0.038073017265045125, "acc_norm": 0.5260115606936416, "acc_norm_stderr": 0.038073017265045125 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.0433643270799318, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.0433643270799318 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4595744680851064, "acc_stderr": 0.03257901482099835, "acc_norm": 0.4595744680851064, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.04142439719489361, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.04142439719489361 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.023919984164047732, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.023919984164047732 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.042857142857142816, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.042857142857142816 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6870967741935484, "acc_stderr": 0.02637756702864586, "acc_norm": 0.6870967741935484, "acc_norm_stderr": 0.02637756702864586 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.39901477832512317, "acc_stderr": 0.03445487686264715, "acc_norm": 0.39901477832512317, "acc_norm_stderr": 0.03445487686264715 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6606060606060606, "acc_stderr": 0.03697442205031595, "acc_norm": 0.6606060606060606, "acc_norm_stderr": 0.03697442205031595 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.032424979581788166, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.032424979581788166 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8341968911917098, "acc_stderr": 0.026839845022314415, "acc_norm": 0.8341968911917098, "acc_norm_stderr": 0.026839845022314415 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5256410256410257, "acc_stderr": 0.025317649726448656, "acc_norm": 0.5256410256410257, "acc_norm_stderr": 0.025317649726448656 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547307, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547307 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6134453781512605, "acc_stderr": 0.03163145807552379, "acc_norm": 
0.6134453781512605, "acc_norm_stderr": 0.03163145807552379 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658753, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658753 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7211009174311926, "acc_stderr": 0.01922746887646351, "acc_norm": 0.7211009174311926, "acc_norm_stderr": 0.01922746887646351 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.029331162294251735, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.029331162294251735 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7426160337552743, "acc_stderr": 0.028458820991460295, "acc_norm": 0.7426160337552743, "acc_norm_stderr": 0.028458820991460295 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835794, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6259541984732825, "acc_stderr": 0.042438692422305246, "acc_norm": 0.6259541984732825, "acc_norm_stderr": 0.042438692422305246 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7107438016528925, "acc_stderr": 0.04139112727635463, "acc_norm": 0.7107438016528925, "acc_norm_stderr": 0.04139112727635463 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6944444444444444, "acc_stderr": 0.04453197507374983, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.04453197507374983 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.2767857142857143, "acc_stderr": 0.04246624336697625, "acc_norm": 0.2767857142857143, "acc_norm_stderr": 0.04246624336697625 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7991452991452992, "acc_stderr": 0.026246772946890467, "acc_norm": 0.7991452991452992, "acc_norm_stderr": 0.026246772946890467 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7496807151979565, "acc_stderr": 0.015491088951494569, "acc_norm": 0.7496807151979565, "acc_norm_stderr": 0.015491088951494569 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6358381502890174, "acc_stderr": 0.025906632631016127, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.025906632631016127 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38100558659217876, "acc_stderr": 0.016242028834053613, "acc_norm": 0.38100558659217876, "acc_norm_stderr": 0.016242028834053613 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6405228758169934, "acc_stderr": 0.027475969910660952, "acc_norm": 0.6405228758169934, "acc_norm_stderr": 0.027475969910660952 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6366559485530546, "acc_stderr": 0.027316847674192714, "acc_norm": 0.6366559485530546, "acc_norm_stderr": 0.027316847674192714 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.6388888888888888, "acc_stderr": 0.026725868809100793, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.026725868809100793 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.425531914893617, "acc_stderr": 0.02949482760014438, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.02949482760014438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44589308996088656, "acc_stderr": 0.012695244711379776, "acc_norm": 0.44589308996088656, "acc_norm_stderr": 0.012695244711379776 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5588235294117647, "acc_stderr": 0.03016191193076711, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.03016191193076711 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5473856209150327, "acc_stderr": 0.020136790918492523, "acc_norm": 0.5473856209150327, "acc_norm_stderr": 0.020136790918492523 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6285714285714286, "acc_stderr": 0.03093285879278985, "acc_norm": 0.6285714285714286, "acc_norm_stderr": 0.03093285879278985 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.031157150869355558, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.031157150869355558 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263686, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.03158149539338734, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.3243574051407589, "mc1_stderr": 0.016387976779647935, "mc2": 0.44868368066946906, "mc2_stderr": 0.015140951474620613 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_SLAM-group__NewHope
[ "region:us" ]
2023-08-18T10:30:11+00:00
{"pretty_name": "Evaluation run of SLAM-group/NewHope", "dataset_summary": "Dataset automatically created during the evaluation run of model [SLAM-group/NewHope](https://huggingface.co/SLAM-group/NewHope) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SLAM-group__NewHope\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-02T16:20:26.294433](https://huggingface.co/datasets/open-llm-leaderboard/details_SLAM-group__NewHope/blob/main/results_2023-08-02T16%3A20%3A26.294433.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5588691829632426,\n \"acc_stderr\": 0.03433115773924322,\n \"acc_norm\": 0.5628652703397449,\n \"acc_norm_stderr\": 0.03430877590228174,\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.016387976779647935,\n \"mc2\": 0.44868368066946906,\n \"mc2_stderr\": 0.015140951474620613\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848022,\n \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513782\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6366261700856403,\n \"acc_stderr\": 0.004799882248494812,\n \"acc_norm\": 0.8399721171081458,\n \"acc_norm_stderr\": 0.003658826208101608\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.03036505082911521,\n \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.03036505082911521\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 
0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.0433643270799318,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.0433643270799318\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5256410256410257,\n \"acc_stderr\": 
0.025317649726448656,\n \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448656\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552379,\n \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552379\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7211009174311926,\n \"acc_stderr\": 0.01922746887646351,\n \"acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.01922746887646351\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460295,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460295\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890467,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890467\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n \"acc_stderr\": 0.015491088951494569,\n \"acc_norm\": 0.7496807151979565,\n 
\"acc_norm_stderr\": 0.015491088951494569\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n \"acc_stderr\": 0.016242028834053613,\n \"acc_norm\": 0.38100558659217876,\n \"acc_norm_stderr\": 0.016242028834053613\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n \"acc_stderr\": 0.027316847674192714,\n \"acc_norm\": 0.6366559485530546,\n \"acc_norm_stderr\": 0.027316847674192714\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014438,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379776,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379776\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.03016191193076711,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.03016191193076711\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492523,\n \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492523\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.016387976779647935,\n \"mc2\": 0.44868368066946906,\n \"mc2_stderr\": 0.015140951474620613\n }\n}\n```", "repo_url": "https://huggingface.co/SLAM-group/NewHope", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_08_02T16_20_26.294433", "path": ["**/details_harness|arc:challenge|25_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hellaswag|10_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:20:26.294433.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:20:26.294433.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:20:26.294433.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T16:20:26.294433.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T16:20:26.294433.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T16_20_26.294433", "path": ["results_2023-08-02T16:20:26.294433.parquet"]}, {"split": "latest", "path": ["results_2023-08-02T16:20:26.294433.parquet"]}]}]}
2023-08-27T11:34:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SLAM-group/NewHope ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model SLAM-group/NewHope on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-02T16:20:26.294433 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
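The processed text above drops the card's original code snippet. Below is a minimal sketch of the intended call, assuming the repository follows the leaderboard's `details_<org>__<model>` naming (so `open-llm-leaderboard/details_SLAM-group__NewHope`, which is not spelled out in this row) and using the `harness_truthfulqa_mc_0` configuration listed in the metadata above:

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> pattern used by
# other Open LLM Leaderboard detail datasets; the config name comes from the
# metadata block above.
data = load_dataset(
    "open-llm-leaderboard/details_SLAM-group__NewHope",
    "harness_truthfulqa_mc_0",
    split="train",  # per the card, "train" points to the latest results
)
print(data)
```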
[ "# Dataset Card for Evaluation run of SLAM-group/NewHope", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model SLAM-group/NewHope on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-02T16:20:26.294433 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SLAM-group/NewHope", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model SLAM-group/NewHope on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-02T16:20:26.294433 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SLAM-group/NewHope## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model SLAM-group/NewHope on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-02T16:20:26.294433 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ee609bc10b22eed666125ce2339ed3ca4e4ae686
# Dataset Card for Evaluation run of ziqingyang/chinese-alpaca-2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/ziqingyang/chinese-alpaca-2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [ziqingyang/chinese-alpaca-2-7b](https://huggingface.co/ziqingyang/chinese-alpaca-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T20:52:04.349396](https://huggingface.co/datasets/open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-7b/blob/main/results_2023-10-16T20-52-04.349396.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.31008808724832215,
        "em_stderr": 0.00473673419159097,
        "f1": 0.3665834731543636,
        "f1_stderr": 0.004667666858281671,
        "acc": 0.37884916762058746,
        "acc_stderr": 0.009648470423229465
    },
    "harness|drop|3": {
        "em": 0.31008808724832215,
        "em_stderr": 0.00473673419159097,
        "f1": 0.3665834731543636,
        "f1_stderr": 0.004667666858281671
    },
    "harness|gsm8k|5": {
        "acc": 0.0576194086429113,
        "acc_stderr": 0.006418593319822861
    },
    "harness|winogrande|5": {
        "acc": 0.7000789265982637,
        "acc_stderr": 0.01287834752663607
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
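The card shows how to load a single task configuration; the "results" configuration and the "latest" split it describes can be inspected the same way. A minimal sketch, using the config and split names declared in the repository metadata below ("results", "latest"):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-7b"

# Every per-task configuration plus the aggregated "results" config.
print(get_dataset_config_names(repo))

# The "latest" split of the "results" config holds the aggregated metrics
# from the most recent evaluation run of this model.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```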
open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-7b
[ "region:us" ]
2023-08-18T10:30:20+00:00
{"pretty_name": "Evaluation run of ziqingyang/chinese-alpaca-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ziqingyang/chinese-alpaca-2-7b](https://huggingface.co/ziqingyang/chinese-alpaca-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T20:52:04.349396](https://huggingface.co/datasets/open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-7b/blob/main/results_2023-10-16T20-52-04.349396.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31008808724832215,\n \"em_stderr\": 0.00473673419159097,\n \"f1\": 0.3665834731543636,\n \"f1_stderr\": 0.004667666858281671,\n \"acc\": 0.37884916762058746,\n \"acc_stderr\": 0.009648470423229465\n },\n \"harness|drop|3\": {\n \"em\": 0.31008808724832215,\n \"em_stderr\": 0.00473673419159097,\n \"f1\": 0.3665834731543636,\n \"f1_stderr\": 0.004667666858281671\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0576194086429113,\n \"acc_stderr\": 0.006418593319822861\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7000789265982637,\n \"acc_stderr\": 0.01287834752663607\n }\n}\n```", "repo_url": "https://huggingface.co/ziqingyang/chinese-alpaca-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T20_52_04.349396", "path": ["**/details_harness|drop|3_2023-10-16T20-52-04.349396.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T20-52-04.349396.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T20_52_04.349396", "path": ["**/details_harness|gsm8k|5_2023-10-16T20-52-04.349396.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T20-52-04.349396.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:39:32.814142.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:39:32.814142.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:39:32.814142.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:39:32.814142.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:39:32.814142.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T20_52_04.349396", "path": ["**/details_harness|winogrande|5_2023-10-16T20-52-04.349396.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T20-52-04.349396.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_39_32.814142", "path": ["results_2023-08-09T11:39:32.814142.parquet"]}, {"split": "2023_10_16T20_52_04.349396", "path": ["results_2023-10-16T20-52-04.349396.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T20-52-04.349396.parquet"]}]}]}
2023-10-16T19:52:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ziqingyang/chinese-alpaca-2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model ziqingyang/chinese-alpaca-2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T20:52:04.349396 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
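The processed text above strips the card's code example. A minimal sketch reconstructing it from the metadata, showing both the latest results and the specific timestamped run split recorded for the `harness_winogrande_5` configuration:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_ziqingyang__chinese-alpaca-2-7b"

# "train" always points to the latest results for this configuration.
latest = load_dataset(repo, "harness_winogrande_5", split="train")

# Each run is also kept as a split named after its timestamp, as declared
# in the repository metadata above.
run = load_dataset(repo, "harness_winogrande_5", split="2023_10_16T20_52_04.349396")

print(latest)
print(run)
```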
[ "# Dataset Card for Evaluation run of ziqingyang/chinese-alpaca-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ziqingyang/chinese-alpaca-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T20:52:04.349396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ziqingyang/chinese-alpaca-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ziqingyang/chinese-alpaca-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T20:52:04.349396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ziqingyang/chinese-alpaca-2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ziqingyang/chinese-alpaca-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T20:52:04.349396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7b8355403bfd1efbc28cf77fe468adf7d8894212
# Dataset Card for Evaluation run of ziqingyang/chinese-llama-2-7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/ziqingyang/chinese-llama-2-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [ziqingyang/chinese-llama-2-7b](https://huggingface.co/ziqingyang/chinese-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ziqingyang__chinese-llama-2-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-17T16:11:09.467879](https://huggingface.co/datasets/open-llm-leaderboard/details_ziqingyang__chinese-llama-2-7b/blob/main/results_2023-10-17T16-11-09.467879.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.008703859060402684, "em_stderr": 0.0009512557261398741, "f1": 0.08773280201342261, "f1_stderr": 0.0016822920336997918, "acc": 0.35211166049236076, "acc_stderr": 0.008141255030998417 }, "harness|drop|3": { "em": 0.008703859060402684, "em_stderr": 0.0009512557261398741, "f1": 0.08773280201342261, "f1_stderr": 0.0016822920336997918 }, "harness|gsm8k|5": { "acc": 0.014404852160727824, "acc_stderr": 0.0032820559171369344 }, "harness|winogrande|5": { "acc": 0.6898184688239937, "acc_stderr": 0.0130004541448599 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
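The aggregated scores referenced above can be read back from the "results" configuration in the same way; a minimal sketch, assuming only the configuration and split names listed in this repository's metadata (a "results" configuration whose "latest" split points to the most recent run):

```python
from datasets import load_dataset

# Minimal sketch: the "results" configuration stores the aggregated per-task
# scores, and its "latest" split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ziqingyang__chinese-llama-2-7b",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics such as the winogrande and gsm8k accuracies
```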
open-llm-leaderboard/details_ziqingyang__chinese-llama-2-7b
[ "region:us" ]
2023-08-18T10:30:29+00:00
{"pretty_name": "Evaluation run of ziqingyang/chinese-llama-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ziqingyang/chinese-llama-2-7b](https://huggingface.co/ziqingyang/chinese-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ziqingyang__chinese-llama-2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T16:11:09.467879](https://huggingface.co/datasets/open-llm-leaderboard/details_ziqingyang__chinese-llama-2-7b/blob/main/results_2023-10-17T16-11-09.467879.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008703859060402684,\n \"em_stderr\": 0.0009512557261398741,\n \"f1\": 0.08773280201342261,\n \"f1_stderr\": 0.0016822920336997918,\n \"acc\": 0.35211166049236076,\n \"acc_stderr\": 0.008141255030998417\n },\n \"harness|drop|3\": {\n \"em\": 0.008703859060402684,\n \"em_stderr\": 0.0009512557261398741,\n \"f1\": 0.08773280201342261,\n \"f1_stderr\": 0.0016822920336997918\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.0032820559171369344\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6898184688239937,\n \"acc_stderr\": 0.0130004541448599\n }\n}\n```", "repo_url": "https://huggingface.co/ziqingyang/chinese-llama-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T16_11_09.467879", "path": ["**/details_harness|drop|3_2023-10-17T16-11-09.467879.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T16-11-09.467879.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T16_11_09.467879", "path": ["**/details_harness|gsm8k|5_2023-10-17T16-11-09.467879.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T16-11-09.467879.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:36:32.525773.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:36:32.525773.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:36:32.525773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:36:32.525773.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:36:32.525773.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T16_11_09.467879", "path": ["**/details_harness|winogrande|5_2023-10-17T16-11-09.467879.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T16-11-09.467879.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_36_32.525773", "path": ["results_2023-08-09T11:36:32.525773.parquet"]}, {"split": "2023_10_17T16_11_09.467879", "path": ["results_2023-10-17T16-11-09.467879.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T16-11-09.467879.parquet"]}]}]}
2023-10-17T15:11:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ziqingyang/chinese-llama-2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model ziqingyang/chinese-llama-2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T16:11:09.467879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
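To enumerate the available configurations of this details repository and load one of them, a short hedged sketch (the configuration name "harness_gsm8k_5" and the "latest" split are taken from the metadata above; `get_dataset_config_names` comes from the `datasets` library):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ziqingyang__chinese-llama-2-7b"

# List the available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load the most recent details for one task-level configuration.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```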
[ "# Dataset Card for Evaluation run of ziqingyang/chinese-llama-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ziqingyang/chinese-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T16:11:09.467879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ziqingyang/chinese-llama-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ziqingyang/chinese-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T16:11:09.467879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ziqingyang/chinese-llama-2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ziqingyang/chinese-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T16:11:09.467879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c27e6d9222b2ad6728be2d0ea3f196c7573a8423
# Dataset Card for Evaluation run of timdettmers/guanaco-33b-merged ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/timdettmers/guanaco-33b-merged - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [timdettmers/guanaco-33b-merged](https://huggingface.co/timdettmers/guanaco-33b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_timdettmers__guanaco-33b-merged", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-20T14:01:44.026263](https://huggingface.co/datasets/open-llm-leaderboard/details_timdettmers__guanaco-33b-merged/blob/main/results_2023-07-20T14%3A01%3A44.026263.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5404553018205109, "acc_stderr": 0.03488622237927161, "acc_norm": 0.5444824613318672, "acc_norm_stderr": 0.03486249375448495, "mc1": 0.34761321909424725, "mc1_stderr": 0.016670769188897306, "mc2": 0.5121992740888713, "mc2_stderr": 0.014650490351006002 }, "harness|arc:challenge|25": { "acc": 0.5870307167235495, "acc_stderr": 0.014388344935398326, "acc_norm": 0.6245733788395904, "acc_norm_stderr": 0.014150631435111726 }, "harness|hellaswag|10": { "acc": 0.6446922923720374, "acc_stderr": 0.004776283203468098, "acc_norm": 0.8447520414260108, "acc_norm_stderr": 0.003614007841341989 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.48148148148148145, "acc_stderr": 0.043163785995113245, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.043163785995113245 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5657894736842105, "acc_stderr": 0.04033565667848319, "acc_norm": 0.5657894736842105, "acc_norm_stderr": 0.04033565667848319 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5622641509433962, "acc_stderr": 0.030533338430467516, "acc_norm": 0.5622641509433962, "acc_norm_stderr": 0.030533338430467516 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5625, "acc_stderr": 0.04148415739394154, "acc_norm": 0.5625, "acc_norm_stderr": 0.04148415739394154 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, 
"acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383889, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383889 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4595744680851064, "acc_stderr": 0.03257901482099835, "acc_norm": 0.4595744680851064, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3684210526315789, "acc_stderr": 0.04537815354939391, "acc_norm": 0.3684210526315789, "acc_norm_stderr": 0.04537815354939391 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4413793103448276, "acc_stderr": 0.04137931034482758, "acc_norm": 0.4413793103448276, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.31216931216931215, "acc_stderr": 0.0238652068369726, "acc_norm": 0.31216931216931215, "acc_norm_stderr": 0.0238652068369726 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147126, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147126 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6290322580645161, "acc_stderr": 0.027480541887953593, "acc_norm": 0.6290322580645161, "acc_norm_stderr": 0.027480541887953593 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3793103448275862, "acc_stderr": 0.03413963805906235, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.03413963805906235 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7212121212121212, "acc_stderr": 0.035014387062967806, "acc_norm": 0.7212121212121212, "acc_norm_stderr": 0.035014387062967806 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03191178226713547, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03191178226713547 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7150259067357513, "acc_stderr": 0.032577140777096614, "acc_norm": 0.7150259067357513, "acc_norm_stderr": 0.032577140777096614 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.49230769230769234, "acc_stderr": 0.025348006031534778, "acc_norm": 0.49230769230769234, "acc_norm_stderr": 0.025348006031534778 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.02730914058823019, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.02730914058823019 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5294117647058824, 
"acc_stderr": 0.03242225027115006, "acc_norm": 0.5294117647058824, "acc_norm_stderr": 0.03242225027115006 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.03822746937658752, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7302752293577982, "acc_stderr": 0.019028486711115438, "acc_norm": 0.7302752293577982, "acc_norm_stderr": 0.019028486711115438 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.375, "acc_stderr": 0.033016908987210894, "acc_norm": 0.375, "acc_norm_stderr": 0.033016908987210894 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967409, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967409 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.02747974455080851, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.02747974455080851 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5874439461883408, "acc_stderr": 0.03304062175449297, "acc_norm": 0.5874439461883408, "acc_norm_stderr": 0.03304062175449297 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6412213740458015, "acc_stderr": 0.04206739313864908, "acc_norm": 0.6412213740458015, "acc_norm_stderr": 0.04206739313864908 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04712821257426769, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04712821257426769 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028546, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028546 }, "harness|hendrycksTest-management|5": { "acc": 0.6310679611650486, "acc_stderr": 0.0477761518115674, "acc_norm": 0.6310679611650486, "acc_norm_stderr": 0.0477761518115674 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7863247863247863, "acc_stderr": 0.026853450377009154, "acc_norm": 0.7863247863247863, "acc_norm_stderr": 0.026853450377009154 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6756066411238825, "acc_stderr": 0.0167409290471627, "acc_norm": 0.6756066411238825, "acc_norm_stderr": 0.0167409290471627 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5491329479768786, "acc_stderr": 0.026788811931562757, "acc_norm": 0.5491329479768786, "acc_norm_stderr": 0.026788811931562757 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2569832402234637, "acc_stderr": 0.01461446582196632, "acc_norm": 0.2569832402234637, "acc_norm_stderr": 0.01461446582196632 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5718954248366013, "acc_stderr": 0.028332397483664278, "acc_norm": 0.5718954248366013, "acc_norm_stderr": 0.028332397483664278 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6012861736334405, "acc_stderr": 0.027809322585774496, "acc_norm": 0.6012861736334405, "acc_norm_stderr": 0.027809322585774496 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.6172839506172839, "acc_stderr": 0.027044538138402595, "acc_norm": 0.6172839506172839, "acc_norm_stderr": 0.027044538138402595 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.42907801418439717, "acc_stderr": 0.029525914302558555, "acc_norm": 0.42907801418439717, "acc_norm_stderr": 0.029525914302558555 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.424380704041721, "acc_stderr": 0.01262334375743002, "acc_norm": 0.424380704041721, "acc_norm_stderr": 0.01262334375743002 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5808823529411765, "acc_stderr": 0.02997280717046462, "acc_norm": 0.5808823529411765, "acc_norm_stderr": 0.02997280717046462 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5212418300653595, "acc_stderr": 0.020209572388600248, "acc_norm": 0.5212418300653595, "acc_norm_stderr": 0.020209572388600248 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5918367346938775, "acc_stderr": 0.03146465712827424, "acc_norm": 0.5918367346938775, "acc_norm_stderr": 0.03146465712827424 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6915422885572139, "acc_stderr": 0.032658195885126966, "acc_norm": 0.6915422885572139, "acc_norm_stderr": 0.032658195885126966 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.78, "acc_stderr": 0.041633319989322626, "acc_norm": 0.78, "acc_norm_stderr": 0.041633319989322626 }, "harness|hendrycksTest-virology|5": { "acc": 0.4578313253012048, "acc_stderr": 0.038786267710023595, "acc_norm": 0.4578313253012048, "acc_norm_stderr": 0.038786267710023595 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7251461988304093, "acc_stderr": 0.03424042924691584, "acc_norm": 0.7251461988304093, "acc_norm_stderr": 0.03424042924691584 }, "harness|truthfulqa:mc|0": { "mc1": 0.34761321909424725, "mc1_stderr": 0.016670769188897306, "mc2": 0.5121992740888713, "mc2_stderr": 0.014650490351006002 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
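### Working with the aggregated results

Besides the per-task detail configurations, the summary above mentions a "results" configuration that stores the aggregated metrics of the run, with the "latest" split pointing at the most recent evaluation. The snippet below is a minimal sketch of how that configuration could be inspected; the conversion to a pandas DataFrame and the printed columns are illustrative assumptions, since the exact schema of the results files is not documented in this card.

```python
from datasets import load_dataset

# Aggregated metrics for this run live in the "results" configuration;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_timdettmers__guanaco-33b-merged",
    "results",
    split="latest",
)

# Illustrative inspection only -- the column layout is an assumption,
# so print it rather than hard-coding field names.
df = results.to_pandas()
print(df.columns.tolist())
print(df.head())
```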
aa63ada8ba2ee62cee7b7b47c06f404ff951a508
# Dataset Card for Evaluation run of klosax/pythia-70m-deduped-step44k-92bt

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/klosax/pythia-70m-deduped-step44k-92bt
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [klosax/pythia-70m-deduped-step44k-92bt](https://huggingface.co/klosax/pythia-70m-deduped-step44k-92bt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_klosax__pythia-70m-deduped-step44k-92bt",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-16T19:22:51.930931](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__pythia-70m-deduped-step44k-92bt/blob/main/results_2023-09-16T19-22-51.930931.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.000234437804648362,
        "f1": 0.023688129194630956,
        "f1_stderr": 0.0008485245166671287,
        "acc": 0.25769534333070243,
        "acc_stderr": 0.007022913394891831
    },
    "harness|drop|3": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.000234437804648362,
        "f1": 0.023688129194630956,
        "f1_stderr": 0.0008485245166671287
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5153906866614049,
        "acc_stderr": 0.014045826789783661
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
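Besides the per-task details shown above, the aggregated scores can be read directly from the "results" configuration. The following is a minimal sketch, assuming the `datasets` library resolves the "results" configuration and its "latest" split exactly as declared in this card's YAML metadata:

```python
from datasets import load_dataset

# Aggregated metrics for this model's runs; the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_klosax__pythia-70m-deduped-step44k-92bt",
    "results",
    split="latest",
)

# Print the first row of the aggregated results table.
print(results[0])
```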
open-llm-leaderboard/details_klosax__pythia-70m-deduped-step44k-92bt
[ "region:us" ]
2023-08-18T10:30:46+00:00
{"pretty_name": "Evaluation run of klosax/pythia-70m-deduped-step44k-92bt", "dataset_summary": "Dataset automatically created during the evaluation run of model [klosax/pythia-70m-deduped-step44k-92bt](https://huggingface.co/klosax/pythia-70m-deduped-step44k-92bt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_klosax__pythia-70m-deduped-step44k-92bt\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T19:22:51.930931](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__pythia-70m-deduped-step44k-92bt/blob/main/results_2023-09-16T19-22-51.930931.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.000234437804648362,\n \"f1\": 0.023688129194630956,\n \"f1_stderr\": 0.0008485245166671287,\n \"acc\": 0.25769534333070243,\n \"acc_stderr\": 0.007022913394891831\n },\n \"harness|drop|3\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.000234437804648362,\n \"f1\": 0.023688129194630956,\n \"f1_stderr\": 0.0008485245166671287\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5153906866614049,\n \"acc_stderr\": 0.014045826789783661\n }\n}\n```", "repo_url": "https://huggingface.co/klosax/pythia-70m-deduped-step44k-92bt", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T19_22_51.930931", "path": ["**/details_harness|drop|3_2023-09-16T19-22-51.930931.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T19-22-51.930931.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T19_22_51.930931", "path": ["**/details_harness|gsm8k|5_2023-09-16T19-22-51.930931.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T19-22-51.930931.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": 
["**/details_harness|hellaswag|10_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hellaswag|10_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:26:02.759648.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:26:02.759648.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:43:50.721558.parquet", 
"**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:43:50.721558.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:43:50.721558.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:43:50.721558.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:43:50.721558.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:43:50.721558.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:43:50.721558.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T19_22_51.930931", "path": ["**/details_harness|winogrande|5_2023-09-16T19-22-51.930931.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T19-22-51.930931.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T09_26_02.759648", "path": ["results_2023-07-24T09:26:02.759648.parquet"]}, {"split": "2023_07_24T09_43_50.721558", "path": ["results_2023-07-24T09:43:50.721558.parquet"]}, {"split": "2023_09_16T19_22_51.930931", "path": ["results_2023-09-16T19-22-51.930931.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T19-22-51.930931.parquet"]}]}]}
2023-09-16T18:23:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of klosax/pythia-70m-deduped-step44k-92bt ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model klosax/pythia-70m-deduped-step44k-92bt on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T19:22:51.930931 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of klosax/pythia-70m-deduped-step44k-92bt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/pythia-70m-deduped-step44k-92bt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T19:22:51.930931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of klosax/pythia-70m-deduped-step44k-92bt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/pythia-70m-deduped-step44k-92bt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T19:22:51.930931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of klosax/pythia-70m-deduped-step44k-92bt## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/pythia-70m-deduped-step44k-92bt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T19:22:51.930931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0ad6b75351c0234a012656d297dcb147d59a4e20
# Dataset Card for "b-mc2-sql-create-context" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EnigmaOfTheWorld/b-mc2-sql-create-context
[ "region:us" ]
2023-08-18T10:30:51+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 28955156, "num_examples": 78577}], "download_size": 0, "dataset_size": 28955156}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-08-21T03:09:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for "b-mc2-sql-create-context" More Information needed
[ "# Dataset Card for \"b-mc2-sql-create-context\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"b-mc2-sql-create-context\"\n\nMore Information needed" ]
[ 6, 23 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"b-mc2-sql-create-context\"\n\nMore Information needed" ]
d73569b77c409fd6c38c5d1b48f5e40f777b9042
# Dataset Card for Evaluation run of klosax/open_llama_7b_400bt_preview ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/klosax/open_llama_7b_400bt_preview - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [klosax/open_llama_7b_400bt_preview](https://huggingface.co/klosax/open_llama_7b_400bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_klosax__open_llama_7b_400bt_preview", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-24T11:21:09.797599](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_7b_400bt_preview/blob/main/results_2023-07-24T11%3A21%3A09.797599.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2815767296994265, "acc_stderr": 0.03231700614609591, "acc_norm": 0.2849357778261771, "acc_norm_stderr": 0.03231676634268911, "mc1": 0.22888616891064872, "mc1_stderr": 0.014706994909055027, "mc2": 0.360402821143995, "mc2_stderr": 0.013409179932482647 }, "harness|arc:challenge|25": { "acc": 0.36177474402730375, "acc_stderr": 0.014041957945038064, "acc_norm": 0.39505119453924914, "acc_norm_stderr": 0.014285898292938169 }, "harness|hellaswag|10": { "acc": 0.4939255128460466, "acc_stderr": 0.004989413158034797, "acc_norm": 0.658832901812388, "acc_norm_stderr": 0.004731324409133264 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3111111111111111, "acc_stderr": 0.03999262876617722, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.03999262876617722 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.28289473684210525, "acc_stderr": 0.03665349695640767, "acc_norm": 0.28289473684210525, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2981132075471698, "acc_stderr": 0.028152837942493857, "acc_norm": 0.2981132075471698, "acc_norm_stderr": 0.028152837942493857 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr":
0.04793724854411021, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411021 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2543352601156069, "acc_stderr": 0.0332055644308557, "acc_norm": 0.2543352601156069, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237655, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237655 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.19574468085106383, "acc_stderr": 0.025937853139977148, "acc_norm": 0.19574468085106383, "acc_norm_stderr": 0.025937853139977148 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813344, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813344 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.296551724137931, "acc_stderr": 0.03806142687309993, "acc_norm": 0.296551724137931, "acc_norm_stderr": 0.03806142687309993 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2671957671957672, "acc_stderr": 0.02278967314577656, "acc_norm": 0.2671957671957672, "acc_norm_stderr": 0.02278967314577656 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.16666666666666666, "acc_stderr": 0.03333333333333337, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.03333333333333337 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25161290322580643, "acc_stderr": 0.024685979286239956, "acc_norm": 0.25161290322580643, "acc_norm_stderr": 0.024685979286239956 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.28078817733990147, "acc_stderr": 0.03161856335358609, "acc_norm": 0.28078817733990147, "acc_norm_stderr": 0.03161856335358609 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.26666666666666666, "acc_stderr": 0.03453131801885415, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3434343434343434, "acc_stderr": 0.033832012232444426, "acc_norm": 0.3434343434343434, "acc_norm_stderr": 0.033832012232444426 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.36787564766839376, "acc_stderr": 0.03480175668466036, "acc_norm": 0.36787564766839376, "acc_norm_stderr": 0.03480175668466036 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.358974358974359, "acc_stderr": 0.024321738484602357, "acc_norm": 0.358974358974359, "acc_norm_stderr": 0.024321738484602357 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.02684205787383371, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.02684205787383371 }, 
"harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2582781456953642, "acc_stderr": 0.035737053147634576, "acc_norm": 0.2582781456953642, "acc_norm_stderr": 0.035737053147634576 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3486238532110092, "acc_stderr": 0.020431254090714328, "acc_norm": 0.3486238532110092, "acc_norm_stderr": 0.020431254090714328 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4398148148148148, "acc_stderr": 0.033851779760448106, "acc_norm": 0.4398148148148148, "acc_norm_stderr": 0.033851779760448106 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.27941176470588236, "acc_stderr": 0.031493281045079556, "acc_norm": 0.27941176470588236, "acc_norm_stderr": 0.031493281045079556 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.25316455696202533, "acc_stderr": 0.028304657943035303, "acc_norm": 0.25316455696202533, "acc_norm_stderr": 0.028304657943035303 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.13901345291479822, "acc_stderr": 0.023219352834474464, "acc_norm": 0.13901345291479822, "acc_norm_stderr": 0.023219352834474464 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2824427480916031, "acc_stderr": 0.03948406125768361, "acc_norm": 0.2824427480916031, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.38016528925619836, "acc_stderr": 0.04431324501968432, "acc_norm": 0.38016528925619836, "acc_norm_stderr": 0.04431324501968432 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.21296296296296297, "acc_stderr": 0.039578354719809805, "acc_norm": 0.21296296296296297, "acc_norm_stderr": 0.039578354719809805 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.3006134969325153, "acc_stderr": 0.03602511318806771, "acc_norm": 0.3006134969325153, "acc_norm_stderr": 0.03602511318806771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.16071428571428573, "acc_stderr": 0.03485946096475741, "acc_norm": 0.16071428571428573, "acc_norm_stderr": 0.03485946096475741 }, "harness|hendrycksTest-management|5": { "acc": 0.3786407766990291, "acc_stderr": 0.04802694698258972, "acc_norm": 0.3786407766990291, "acc_norm_stderr": 0.04802694698258972 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.02860595370200425, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.02860595370200425 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.2, "acc_stderr": 0.040201512610368445, "acc_norm": 0.2, "acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.20434227330779056, "acc_stderr": 0.0144191239809319, "acc_norm": 0.20434227330779056, "acc_norm_stderr": 0.0144191239809319 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.29190751445086704, "acc_stderr": 0.02447699407624734, "acc_norm": 0.29190751445086704, "acc_norm_stderr": 0.02447699407624734 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2549019607843137, "acc_stderr": 0.02495418432487991, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.02495418432487991 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2797427652733119, "acc_stderr": 
0.02549425935069489, "acc_norm": 0.2797427652733119, "acc_norm_stderr": 0.02549425935069489 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.22530864197530864, "acc_stderr": 0.023246202647819746, "acc_norm": 0.22530864197530864, "acc_norm_stderr": 0.023246202647819746 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2695035460992908, "acc_stderr": 0.026469036818590638, "acc_norm": 0.2695035460992908, "acc_norm_stderr": 0.026469036818590638 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.26597131681877445, "acc_stderr": 0.011285033165551269, "acc_norm": 0.26597131681877445, "acc_norm_stderr": 0.011285033165551269 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.030211479609121593, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.030211479609121593 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.22549019607843138, "acc_stderr": 0.016906615927288145, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.016906615927288145 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.20909090909090908, "acc_stderr": 0.038950910157241364, "acc_norm": 0.20909090909090908, "acc_norm_stderr": 0.038950910157241364 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4, "acc_stderr": 0.031362502409358936, "acc_norm": 0.4, "acc_norm_stderr": 0.031362502409358936 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24875621890547264, "acc_stderr": 0.030567675938916707, "acc_norm": 0.24875621890547264, "acc_norm_stderr": 0.030567675938916707 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-virology|5": { "acc": 0.1927710843373494, "acc_stderr": 0.030709824050565274, "acc_norm": 0.1927710843373494, "acc_norm_stderr": 0.030709824050565274 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.29239766081871343, "acc_stderr": 0.034886477134579215, "acc_norm": 0.29239766081871343, "acc_norm_stderr": 0.034886477134579215 }, "harness|truthfulqa:mc|0": { "mc1": 0.22888616891064872, "mc1_stderr": 0.014706994909055027, "mc2": 0.360402821143995, "mc2_stderr": 0.013409179932482647 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
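As a minimal sketch of how the aggregated numbers quoted above can be pulled back out, the card's own loading pattern can be pointed at the "results" configuration; the "latest" split used here is the one declared in the configuration metadata further down in this record:

```python
from datasets import load_dataset

# "results" is the aggregated-results configuration; "latest" maps to the
# parquet file of the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_klosax__open_llama_7b_400bt_preview",
    "results",
    split="latest",
)
print(results[0])  # a row that should hold the aggregated metrics shown above
```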
open-llm-leaderboard/details_klosax__open_llama_7b_400bt_preview
[ "region:us" ]
2023-08-18T10:30:56+00:00
{"pretty_name": "Evaluation run of klosax/open_llama_7b_400bt_preview", "dataset_summary": "Dataset automatically created during the evaluation run of model [klosax/open_llama_7b_400bt_preview](https://huggingface.co/klosax/open_llama_7b_400bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_klosax__open_llama_7b_400bt_preview\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-24T11:21:09.797599](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_7b_400bt_preview/blob/main/results_2023-07-24T11%3A21%3A09.797599.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2815767296994265,\n \"acc_stderr\": 0.03231700614609591,\n \"acc_norm\": 0.2849357778261771,\n \"acc_norm_stderr\": 0.03231676634268911,\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.360402821143995,\n \"mc2_stderr\": 0.013409179932482647\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.36177474402730375,\n \"acc_stderr\": 0.014041957945038064,\n \"acc_norm\": 0.39505119453924914,\n \"acc_norm_stderr\": 0.014285898292938169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4939255128460466,\n \"acc_stderr\": 0.004989413158034797,\n \"acc_norm\": 0.658832901812388,\n \"acc_norm_stderr\": 0.004731324409133264\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n 
\"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.19574468085106383,\n \"acc_stderr\": 0.025937853139977148,\n \"acc_norm\": 0.19574468085106383,\n \"acc_norm_stderr\": 0.025937853139977148\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3434343434343434,\n \"acc_stderr\": 0.033832012232444426,\n \"acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.033832012232444426\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13901345291479822,\n \"acc_stderr\": 0.023219352834474464,\n \"acc_norm\": 0.13901345291479822,\n \"acc_norm_stderr\": 0.023219352834474464\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.20434227330779056,\n \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n \"acc_stderr\": 0.02549425935069489,\n \"acc_norm\": 0.2797427652733119,\n \"acc_norm_stderr\": 0.02549425935069489\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n \"acc_stderr\": 0.011285033165551269,\n \"acc_norm\": 0.26597131681877445,\n \"acc_norm_stderr\": 0.011285033165551269\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.016906615927288145,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.016906615927288145\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.360402821143995,\n \"mc2_stderr\": 0.013409179932482647\n }\n}\n```", "repo_url": "https://huggingface.co/klosax/open_llama_7b_400bt_preview", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:21:09.797599.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:21:09.797599.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:21:09.797599.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:21:09.797599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:21:09.797599.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_21_09.797599", "path": ["results_2023-07-24T11:21:09.797599.parquet"]}, {"split": "latest", "path": ["results_2023-07-24T11:21:09.797599.parquet"]}]}]}
2023-08-27T11:34:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of klosax/open_llama_7b_400bt_preview ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model klosax/open_llama_7b_400bt_preview on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-07-24T11:21:09.797599 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
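The flattened card above drops the loading snippet that the full cards include. A minimal sketch of what that call could look like for this dataset is given below; the repository id is assumed from the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this dump, and the config and split names are taken from the config list shown in the metadata above.

```python
from datasets import load_dataset

# Repository id is assumed from the naming pattern used elsewhere in this dump;
# "harness_truthfulqa_mc_0" and the "latest" split appear in the config metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_klosax__open_llama_7b_400bt_preview",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```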
[ "# Dataset Card for Evaluation run of klosax/open_llama_7b_400bt_preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_7b_400bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T11:21:09.797599 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of klosax/open_llama_7b_400bt_preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_7b_400bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T11:21:09.797599 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of klosax/open_llama_7b_400bt_preview## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_7b_400bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-24T11:21:09.797599 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3331f645b84633f51a9c4c23025a5ebb2f4c1482
# Dataset Card for Evaluation run of klosax/open_llama_13b_600bt_preview

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/klosax/open_llama_13b_600bt_preview
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [klosax/open_llama_13b_600bt_preview](https://huggingface.co/klosax/open_llama_13b_600bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T03:30:54.296590](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview/blob/main/results_2023-10-13T03-30-54.296590.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.0002773614457335577,
        "f1": 0.05345952181208074,
        "f1_stderr": 0.0013179175979171436,
        "acc": 0.35200275495116307,
        "acc_stderr": 0.00844603886086826
    },
    "harness|drop|3": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.0002773614457335577,
        "f1": 0.05345952181208074,
        "f1_stderr": 0.0013179175979171436
    },
    "harness|gsm8k|5": {
        "acc": 0.019711902956785442,
        "acc_stderr": 0.0038289829787357095
    },
    "harness|winogrande|5": {
        "acc": 0.6842936069455406,
        "acc_stderr": 0.01306309474300081
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview
[ "region:us" ]
2023-08-18T10:31:06+00:00
{"pretty_name": "Evaluation run of klosax/open_llama_13b_600bt_preview", "dataset_summary": "Dataset automatically created during the evaluation run of model [klosax/open_llama_13b_600bt_preview](https://huggingface.co/klosax/open_llama_13b_600bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T03:30:54.296590](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview/blob/main/results_2023-10-13T03-30-54.296590.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335577,\n \"f1\": 0.05345952181208074,\n \"f1_stderr\": 0.0013179175979171436,\n \"acc\": 0.35200275495116307,\n \"acc_stderr\": 0.00844603886086826\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335577,\n \"f1\": 0.05345952181208074,\n \"f1_stderr\": 0.0013179175979171436\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \"acc_stderr\": 0.0038289829787357095\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6842936069455406,\n \"acc_stderr\": 0.01306309474300081\n }\n}\n```", "repo_url": "https://huggingface.co/klosax/open_llama_13b_600bt_preview", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T03_30_54.296590", "path": ["**/details_harness|drop|3_2023-10-13T03-30-54.296590.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T03-30-54.296590.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T03_30_54.296590", "path": ["**/details_harness|gsm8k|5_2023-10-13T03-30-54.296590.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T03-30-54.296590.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:16:53.504073.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:16:53.504073.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:16:53.504073.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T03_30_54.296590", "path": ["**/details_harness|winogrande|5_2023-10-13T03-30-54.296590.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T03-30-54.296590.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T13_16_53.504073", "path": ["results_2023-07-24T13:16:53.504073.parquet"]}, {"split": "2023_10_13T03_30_54.296590", "path": ["results_2023-10-13T03-30-54.296590.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T03-30-54.296590.parquet"]}]}]}
2023-10-13T02:31:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of klosax/open_llama_13b_600bt_preview ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model klosax/open_llama_13b_600bt_preview on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T03:30:54.296590 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
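As above, the flattened card omits the loading snippet. Since the card points to the aggregated "results" configuration, a minimal sketch of pulling the latest aggregated metrics for this dataset might look like the following; the repository id, config name, and split name are taken from the metadata shown above for this dataset.

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# points at the most recent run (both appear in the config metadata above).
results = load_dataset(
    "open-llm-leaderboard/details_klosax__open_llama_13b_600bt_preview",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```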
[ "# Dataset Card for Evaluation run of klosax/open_llama_13b_600bt_preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_13b_600bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T03:30:54.296590(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of klosax/open_llama_13b_600bt_preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_13b_600bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T03:30:54.296590(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of klosax/open_llama_13b_600bt_preview## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_13b_600bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T03:30:54.296590(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
04893a959cca8fe9cb9fd505d72805e8065b1781
# Dataset Card for Evaluation run of klosax/pythia-160m-deduped-step92k-193bt ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/klosax/pythia-160m-deduped-step92k-193bt - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [klosax/pythia-160m-deduped-step92k-193bt](https://huggingface.co/klosax/pythia-160m-deduped-step92k-193bt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_klosax__pythia-160m-deduped-step92k-193bt", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-14T15:14:54.086566](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__pythia-160m-deduped-step92k-193bt/blob/main/results_2023-10-14T15-14-54.086566.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.001572986577181208, "em_stderr": 0.00040584511324177414, "f1": 0.03547189597315449, "f1_stderr": 0.0010733187820994201, "acc": 0.2560390216931461, "acc_stderr": 0.007871628031487199 }, "harness|drop|3": { "em": 0.001572986577181208, "em_stderr": 0.00040584511324177414, "f1": 0.03547189597315449, "f1_stderr": 0.0010733187820994201 }, "harness|gsm8k|5": { "acc": 0.0037907505686125853, "acc_stderr": 0.0016927007401502023 }, "harness|winogrande|5": { "acc": 0.5082872928176796, "acc_stderr": 0.014050555322824194 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_klosax__pythia-160m-deduped-step92k-193bt
[ "region:us" ]
2023-08-18T10:31:15+00:00
{"pretty_name": "Evaluation run of klosax/pythia-160m-deduped-step92k-193bt", "dataset_summary": "Dataset automatically created during the evaluation run of model [klosax/pythia-160m-deduped-step92k-193bt](https://huggingface.co/klosax/pythia-160m-deduped-step92k-193bt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_klosax__pythia-160m-deduped-step92k-193bt\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T15:14:54.086566](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__pythia-160m-deduped-step92k-193bt/blob/main/results_2023-10-14T15-14-54.086566.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177414,\n \"f1\": 0.03547189597315449,\n \"f1_stderr\": 0.0010733187820994201,\n \"acc\": 0.2560390216931461,\n \"acc_stderr\": 0.007871628031487199\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177414,\n \"f1\": 0.03547189597315449,\n \"f1_stderr\": 0.0010733187820994201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401502023\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5082872928176796,\n \"acc_stderr\": 0.014050555322824194\n }\n}\n```", "repo_url": "https://huggingface.co/klosax/pythia-160m-deduped-step92k-193bt", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T15_14_54.086566", "path": ["**/details_harness|drop|3_2023-10-14T15-14-54.086566.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T15-14-54.086566.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T15_14_54.086566", "path": ["**/details_harness|gsm8k|5_2023-10-14T15-14-54.086566.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T15-14-54.086566.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hellaswag|10_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:50:00.189270.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:50:00.189270.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:50:00.189270.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:50:00.189270.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:50:00.189270.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:50:00.189270.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T15_14_54.086566", "path": ["**/details_harness|winogrande|5_2023-10-14T15-14-54.086566.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T15-14-54.086566.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T09_50_00.189270", "path": ["results_2023-07-24T09:50:00.189270.parquet"]}, {"split": "2023_10_14T15_14_54.086566", "path": ["results_2023-10-14T15-14-54.086566.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T15-14-54.086566.parquet"]}]}]}
2023-10-14T14:15:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of klosax/pythia-160m-deduped-step92k-193bt ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model klosax/pythia-160m-deduped-step92k-193bt on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card text): ## Latest results These are the latest results from run 2023-10-14T15:14:54.086566 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
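The plain-text rendering above drops the code snippet that "To load the details from a run, you can for instance do the following:" refers to. A minimal sketch of that load, using the repository id and the "harness_winogrande_5" config shown elsewhere in this record; using the "latest" split here is an assumption (the card's own example uses "train", which the card says also points to the latest results):

```python
from datasets import load_dataset

# Minimal sketch: pull the Winogrande details for the most recent run of this model.
# Config name and "latest" split come from this record's metadata; "train" would
# resolve to the same latest results according to the card text.
data = load_dataset(
    "open-llm-leaderboard/details_klosax__pythia-160m-deduped-step92k-193bt",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```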
[ "# Dataset Card for Evaluation run of klosax/pythia-160m-deduped-step92k-193bt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/pythia-160m-deduped-step92k-193bt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T15:14:54.086566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of klosax/pythia-160m-deduped-step92k-193bt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/pythia-160m-deduped-step92k-193bt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T15:14:54.086566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of klosax/pythia-160m-deduped-step92k-193bt## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/pythia-160m-deduped-step92k-193bt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T15:14:54.086566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
59262fc4c37ecbdf3a752369a0efef05fdcb017c
# Dataset Card for Evaluation run of klosax/open_llama_3b_350bt_preview ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/klosax/open_llama_3b_350bt_preview - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [klosax/open_llama_3b_350bt_preview](https://huggingface.co/klosax/open_llama_3b_350bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_klosax__open_llama_3b_350bt_preview", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-24T10:25:13.548749](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_3b_350bt_preview/blob/main/results_2023-07-24T10%3A25%3A13.548749.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.27221888756913304, "acc_stderr": 0.03212813724268037, "acc_norm": 0.2751900409341744, "acc_norm_stderr": 0.032129932454657235, "mc1": 0.22031823745410037, "mc1_stderr": 0.01450904517148729, "mc2": 0.35027279444600373, "mc2_stderr": 0.01335009503768823 }, "harness|arc:challenge|25": { "acc": 0.34215017064846415, "acc_stderr": 0.01386415215917728, "acc_norm": 0.3651877133105802, "acc_norm_stderr": 0.014070265519268802 }, "harness|hellaswag|10": { "acc": 0.4563831905994822, "acc_stderr": 0.004970759774676886, "acc_norm": 0.6086436964748058, "acc_norm_stderr": 0.004870563921220623 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2740740740740741, "acc_stderr": 0.03853254836552003, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.03853254836552003 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.23026315789473684, "acc_stderr": 0.03426059424403165, "acc_norm": 0.23026315789473684, "acc_norm_stderr": 0.03426059424403165 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.30566037735849055, "acc_stderr": 0.028353298073322666, "acc_norm": 0.30566037735849055, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.22916666666666666, "acc_stderr": 0.03514697467862388, "acc_norm": 0.22916666666666666, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.27, "acc_stderr": 
0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.22, "acc_stderr": 0.0416333199893227, "acc_norm": 0.22, "acc_norm_stderr": 0.0416333199893227 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24277456647398843, "acc_stderr": 0.0326926380614177, "acc_norm": 0.24277456647398843, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.14705882352941177, "acc_stderr": 0.03524068951567447, "acc_norm": 0.14705882352941177, "acc_norm_stderr": 0.03524068951567447 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.33191489361702126, "acc_stderr": 0.03078373675774565, "acc_norm": 0.33191489361702126, "acc_norm_stderr": 0.03078373675774565 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.041857744240220554, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.041857744240220554 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2620689655172414, "acc_stderr": 0.03664666337225256, "acc_norm": 0.2620689655172414, "acc_norm_stderr": 0.03664666337225256 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25396825396825395, "acc_stderr": 0.02241804289111394, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.02241804289111394 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.14285714285714285, "acc_stderr": 0.03129843185743811, "acc_norm": 0.14285714285714285, "acc_norm_stderr": 0.03129843185743811 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.27419354838709675, "acc_stderr": 0.025378139970885196, "acc_norm": 0.27419354838709675, "acc_norm_stderr": 0.025378139970885196 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2955665024630542, "acc_stderr": 0.032104944337514575, "acc_norm": 0.2955665024630542, "acc_norm_stderr": 0.032104944337514575 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23030303030303031, "acc_stderr": 0.032876667586034886, "acc_norm": 0.23030303030303031, "acc_norm_stderr": 0.032876667586034886 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3383838383838384, "acc_stderr": 0.03371124142626304, "acc_norm": 0.3383838383838384, "acc_norm_stderr": 0.03371124142626304 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.27979274611398963, "acc_stderr": 0.032396370467357036, "acc_norm": 0.27979274611398963, "acc_norm_stderr": 0.032396370467357036 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.26153846153846155, "acc_stderr": 0.022282141204204416, "acc_norm": 0.26153846153846155, "acc_norm_stderr": 0.022282141204204416 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275798, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275798 }, 
"harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2857142857142857, "acc_stderr": 0.02934457250063434, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.02934457250063434 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.28807339449541286, "acc_stderr": 0.01941644589263603, "acc_norm": 0.28807339449541286, "acc_norm_stderr": 0.01941644589263603 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4027777777777778, "acc_stderr": 0.03344887382997866, "acc_norm": 0.4027777777777778, "acc_norm_stderr": 0.03344887382997866 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.22549019607843138, "acc_stderr": 0.02933116229425173, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.02933116229425173 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.28270042194092826, "acc_stderr": 0.02931281415395594, "acc_norm": 0.28270042194092826, "acc_norm_stderr": 0.02931281415395594 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.27802690582959644, "acc_stderr": 0.03006958487449403, "acc_norm": 0.27802690582959644, "acc_norm_stderr": 0.03006958487449403 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.1984732824427481, "acc_stderr": 0.034981493854624734, "acc_norm": 0.1984732824427481, "acc_norm_stderr": 0.034981493854624734 }, "harness|hendrycksTest-international_law|5": { "acc": 0.36363636363636365, "acc_stderr": 0.043913262867240704, "acc_norm": 0.36363636363636365, "acc_norm_stderr": 0.043913262867240704 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2962962962962963, "acc_stderr": 0.04414343666854933, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.04414343666854933 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.27607361963190186, "acc_stderr": 0.0351238528370505, "acc_norm": 0.27607361963190186, "acc_norm_stderr": 0.0351238528370505 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.25892857142857145, "acc_stderr": 0.04157751539865629, "acc_norm": 0.25892857142857145, "acc_norm_stderr": 0.04157751539865629 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690877, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690877 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2564102564102564, "acc_stderr": 0.028605953702004253, "acc_norm": 0.2564102564102564, "acc_norm_stderr": 0.028605953702004253 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 0.044084400227680794, "acc_norm": 0.26, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2835249042145594, "acc_stderr": 0.016117318166832283, "acc_norm": 0.2835249042145594, "acc_norm_stderr": 0.016117318166832283 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2832369942196532, "acc_stderr": 0.02425790170532337, "acc_norm": 0.2832369942196532, "acc_norm_stderr": 0.02425790170532337 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2424581005586592, "acc_stderr": 0.014333522059217889, "acc_norm": 0.2424581005586592, "acc_norm_stderr": 0.014333522059217889 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.2581699346405229, "acc_stderr": 0.025058503316958154, "acc_norm": 0.2581699346405229, "acc_norm_stderr": 0.025058503316958154 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.27009646302250806, "acc_stderr": 
0.025218040373410626, "acc_norm": 0.27009646302250806, "acc_norm_stderr": 0.025218040373410626 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.23148148148148148, "acc_stderr": 0.023468429832451166, "acc_norm": 0.23148148148148148, "acc_norm_stderr": 0.023468429832451166 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2695035460992908, "acc_stderr": 0.026469036818590638, "acc_norm": 0.2695035460992908, "acc_norm_stderr": 0.026469036818590638 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.22816166883963493, "acc_stderr": 0.01071799219204788, "acc_norm": 0.22816166883963493, "acc_norm_stderr": 0.01071799219204788 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3161764705882353, "acc_stderr": 0.028245687391462916, "acc_norm": 0.3161764705882353, "acc_norm_stderr": 0.028245687391462916 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.27124183006535946, "acc_stderr": 0.01798661530403031, "acc_norm": 0.27124183006535946, "acc_norm_stderr": 0.01798661530403031 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.35454545454545455, "acc_stderr": 0.04582004841505416, "acc_norm": 0.35454545454545455, "acc_norm_stderr": 0.04582004841505416 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3020408163265306, "acc_stderr": 0.02939360931987981, "acc_norm": 0.3020408163265306, "acc_norm_stderr": 0.02939360931987981 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24875621890547264, "acc_stderr": 0.030567675938916714, "acc_norm": 0.24875621890547264, "acc_norm_stderr": 0.030567675938916714 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.21, "acc_stderr": 0.04093601807403325, "acc_norm": 0.21, "acc_norm_stderr": 0.04093601807403325 }, "harness|hendrycksTest-virology|5": { "acc": 0.2469879518072289, "acc_stderr": 0.03357351982064536, "acc_norm": 0.2469879518072289, "acc_norm_stderr": 0.03357351982064536 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2807017543859649, "acc_stderr": 0.034462962170884265, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.22031823745410037, "mc1_stderr": 0.01450904517148729, "mc2": 0.35027279444600373, "mc2_stderr": 0.01335009503768823 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_klosax__open_llama_3b_350bt_preview
[ "region:us" ]
2023-08-18T10:31:24+00:00
{"pretty_name": "Evaluation run of klosax/open_llama_3b_350bt_preview", "dataset_summary": "Dataset automatically created during the evaluation run of model [klosax/open_llama_3b_350bt_preview](https://huggingface.co/klosax/open_llama_3b_350bt_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_klosax__open_llama_3b_350bt_preview\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-24T10:25:13.548749](https://huggingface.co/datasets/open-llm-leaderboard/details_klosax__open_llama_3b_350bt_preview/blob/main/results_2023-07-24T10%3A25%3A13.548749.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27221888756913304,\n \"acc_stderr\": 0.03212813724268037,\n \"acc_norm\": 0.2751900409341744,\n \"acc_norm_stderr\": 0.032129932454657235,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.01450904517148729,\n \"mc2\": 0.35027279444600373,\n \"mc2_stderr\": 0.01335009503768823\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.34215017064846415,\n \"acc_stderr\": 0.01386415215917728,\n \"acc_norm\": 0.3651877133105802,\n \"acc_norm_stderr\": 0.014070265519268802\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4563831905994822,\n \"acc_stderr\": 0.004970759774676886,\n \"acc_norm\": 0.6086436964748058,\n \"acc_norm_stderr\": 0.004870563921220623\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.30566037735849055,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n 
\"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.03524068951567447,\n \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.03524068951567447\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.03078373675774565,\n \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.03078373675774565\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.03664666337225256,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.03664666337225256\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111394,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.03129843185743811,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.03129843185743811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.27419354838709675,\n \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.032876667586034886,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.032876667586034886\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626304,\n \"acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626304\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.032396370467357036,\n \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.032396370467357036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204416,\n \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204416\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02934457250063434,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02934457250063434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.28807339449541286,\n \"acc_stderr\": 0.01941644589263603,\n \"acc_norm\": 0.28807339449541286,\n \"acc_norm_stderr\": 0.01941644589263603\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997866,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997866\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425173,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425173\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395594,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395594\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n \"acc_stderr\": 0.03006958487449403,\n \"acc_norm\": 0.27802690582959644,\n \"acc_norm_stderr\": 0.03006958487449403\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.2835249042145594,\n \"acc_stderr\": 0.016117318166832283,\n \"acc_norm\": 0.2835249042145594,\n \"acc_norm_stderr\": 0.016117318166832283\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532337,\n \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532337\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958154,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958154\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n \"acc_stderr\": 0.025218040373410626,\n \"acc_norm\": 0.27009646302250806,\n \"acc_norm_stderr\": 0.025218040373410626\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451166,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451166\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.22816166883963493,\n \"acc_stderr\": 0.01071799219204788,\n \"acc_norm\": 0.22816166883963493,\n \"acc_norm_stderr\": 0.01071799219204788\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.028245687391462916,\n \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.028245687391462916\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.01798661530403031,\n \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.01798661530403031\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.02939360931987981,\n \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.02939360931987981\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403325,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403325\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.01450904517148729,\n \"mc2\": 0.35027279444600373,\n \"mc2_stderr\": 0.01335009503768823\n }\n}\n```", "repo_url": "https://huggingface.co/klosax/open_llama_3b_350bt_preview", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:25:13.548749.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:25:13.548749.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:25:13.548749.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:25:13.548749.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:25:13.548749.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T10_25_13.548749", "path": ["results_2023-07-24T10:25:13.548749.parquet"]}, {"split": "latest", "path": ["results_2023-07-24T10:25:13.548749.parquet"]}]}]}
2023-08-27T11:34:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of klosax/open_llama_3b_350bt_preview ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model klosax/open_llama_3b_350bt_preview on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-07-24T10:25:13.548749 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
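The plain-text rendering above drops the fenced example that follows "you can for instance do the following:" in the original card. A minimal sketch of that call, mirroring the snippet embedded in the metadata field of this record (config name `harness_truthfulqa_mc_0` is the one used there):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (TruthfulQA MC, 0-shot);
# the "train" split always points at the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_klosax__open_llama_3b_350bt_preview",
    "harness_truthfulqa_mc_0",
    split="train",
)
```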
[ "# Dataset Card for Evaluation run of klosax/open_llama_3b_350bt_preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_3b_350bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T10:25:13.548749 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of klosax/open_llama_3b_350bt_preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_3b_350bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T10:25:13.548749 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of klosax/open_llama_3b_350bt_preview## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model klosax/open_llama_3b_350bt_preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-24T10:25:13.548749 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0a38b1d6ccfedaa82f011d381af09c53fc3bb87f
# Dataset Card for Evaluation run of xDAN-AI/xDAN_13b_l2_lora

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/xDAN-AI/xDAN_13b_l2_lora
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [xDAN-AI/xDAN_13b_l2_lora](https://huggingface.co/xDAN-AI/xDAN_13b_l2_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following (a fuller loading sketch appears at the end of this card):

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xDAN-AI__xDAN_13b_l2_lora",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-07-26T14:52:48.502405](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN-AI__xDAN_13b_l2_lora/blob/main/results_2023-07-26T14%3A52%3A48.502405.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.5614989942866122, "acc_stderr": 0.034331003794690465, "acc_norm": 0.5656785190124449, "acc_norm_stderr": 0.03430930050159532, "mc1": 0.31946144430844553, "mc1_stderr": 0.016322644182960498, "mc2": 0.44746680649420667, "mc2_stderr": 0.01496374462169886 },
    "harness|arc:challenge|25": { "acc": 0.5691126279863481, "acc_stderr": 0.01447113339264247, "acc_norm": 0.6100682593856656, "acc_norm_stderr": 0.014252959848892889 },
    "harness|hellaswag|10": { "acc": 0.6207926707827126, "acc_stderr": 0.004841981973515282, "acc_norm": 0.8264289982075284, "acc_norm_stderr": 0.0037796612246514746 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.04026097083296564, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.04026097083296564 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6226415094339622, "acc_stderr": 0.029832808114796005, "acc_norm": 0.6226415094339622, "acc_norm_stderr": 0.029832808114796005 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.6180555555555556, "acc_stderr": 0.040629907841466674, "acc_norm": 0.6180555555555556, "acc_norm_stderr": 0.040629907841466674 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.4913294797687861, "acc_stderr": 0.03811890988940412, "acc_norm": 0.4913294797687861, "acc_norm_stderr": 0.03811890988940412 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.043898699568087764, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.043898699568087764 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.425531914893617, "acc_stderr": 0.03232146916224468, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.03232146916224468 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.3508771929824561, "acc_stderr": 0.044895393502707, "acc_norm": 0.3508771929824561, "acc_norm_stderr": 0.044895393502707 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30687830687830686, "acc_stderr": 0.023752928712112143, "acc_norm": 0.30687830687830686, "acc_norm_stderr": 0.023752928712112143 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.026923446059302844, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.026923446059302844 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.46798029556650245, "acc_stderr": 0.03510766597959217, "acc_norm": 0.46798029556650245, "acc_norm_stderr": 0.03510766597959217 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.0364620496325381, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.0364620496325381 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7171717171717171, "acc_stderr": 0.032087795587867514, "acc_norm": 0.7171717171717171, "acc_norm_stderr": 0.032087795587867514 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7927461139896373, "acc_stderr": 0.02925282329180363, "acc_norm": 0.7927461139896373, "acc_norm_stderr": 0.02925282329180363 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.49743589743589745, "acc_stderr": 0.025350672979412195, "acc_norm": 0.49743589743589745, "acc_norm_stderr": 0.025350672979412195 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2851851851851852, "acc_stderr": 0.027528599210340496, "acc_norm": 0.2851851851851852, "acc_norm_stderr": 0.027528599210340496 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5840336134453782, "acc_stderr": 0.032016501007396114, "acc_norm": 0.5840336134453782, "acc_norm_stderr": 0.032016501007396114 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2781456953642384, "acc_stderr": 0.03658603262763743, "acc_norm": 0.2781456953642384, "acc_norm_stderr": 0.03658603262763743 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7559633027522936, "acc_stderr": 0.018415286351416416, "acc_norm": 0.7559633027522936, "acc_norm_stderr": 0.018415286351416416 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4212962962962963, "acc_stderr": 0.03367462138896079, "acc_norm": 0.4212962962962963, "acc_norm_stderr": 0.03367462138896079 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.029331162294251735, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.029331162294251735 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.0283046579430353, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.0283046579430353 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009224, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009224 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.71900826446281, "acc_stderr": 0.04103203830514512, "acc_norm": 0.71900826446281, "acc_norm_stderr": 0.04103203830514512 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664743, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664743 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.33035714285714285, "acc_stderr": 0.044642857142857144, "acc_norm": 0.33035714285714285, "acc_norm_stderr": 0.044642857142857144 },
    "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.7905982905982906, "acc_stderr": 0.026655699653922737, "acc_norm": 0.7905982905982906, "acc_norm_stderr": 0.026655699653922737 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.58, "acc_stderr": 0.04960449637488583, "acc_norm": 0.58, "acc_norm_stderr": 0.04960449637488583 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7675606641123882, "acc_stderr": 0.015104550008905723, "acc_norm": 0.7675606641123882, "acc_norm_stderr": 0.015104550008905723 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6473988439306358, "acc_stderr": 0.025722802200895806, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.025722802200895806 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.44692737430167595, "acc_stderr": 0.016628030039647614, "acc_norm": 0.44692737430167595, "acc_norm_stderr": 0.016628030039647614 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.6274509803921569, "acc_stderr": 0.027684181883302895, "acc_norm": 0.6274509803921569, "acc_norm_stderr": 0.027684181883302895 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.6302250803858521, "acc_stderr": 0.02741799670563099, "acc_norm": 0.6302250803858521, "acc_norm_stderr": 0.02741799670563099 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.6234567901234568, "acc_stderr": 0.02695934451874778, "acc_norm": 0.6234567901234568, "acc_norm_stderr": 0.02695934451874778 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.41843971631205673, "acc_stderr": 0.02942799403941999, "acc_norm": 0.41843971631205673, "acc_norm_stderr": 0.02942799403941999 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.40352020860495436, "acc_stderr": 0.012530241301193182, "acc_norm": 0.40352020860495436, "acc_norm_stderr": 0.012530241301193182 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.49264705882352944, "acc_stderr": 0.030369552523902173, "acc_norm": 0.49264705882352944, "acc_norm_stderr": 0.030369552523902173 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5522875816993464, "acc_stderr": 0.02011692534742242, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.02011692534742242 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670239, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.04673752333670239 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.6571428571428571, "acc_stderr": 0.030387262919547728, "acc_norm": 0.6571428571428571, "acc_norm_stderr": 0.030387262919547728 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.7412935323383084, "acc_stderr": 0.03096590312357302, "acc_norm": 0.7412935323383084, "acc_norm_stderr": 0.03096590312357302 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 },
    "harness|hendrycksTest-virology|5": { "acc": 0.45180722891566266, "acc_stderr": 0.03874371556587953, "acc_norm": 0.45180722891566266, "acc_norm_stderr": 0.03874371556587953 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.7660818713450293, "acc_stderr": 0.03246721765117826, "acc_norm": 0.7660818713450293, "acc_norm_stderr": 0.03246721765117826 },
    "harness|truthfulqa:mc|0": { "mc1": 0.31946144430844553, "mc1_stderr": 0.016322644182960498, "mc2": 0.44746680649420667, "mc2_stderr": 0.01496374462169886 }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
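As a complement to the snippet in the Dataset Summary above, here is a minimal loading sketch. It assumes the `results` and per-task configurations listed in the repository metadata below are still available on the Hub with a `latest` split; adjust the config name to whichever task you want to inspect.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_xDAN-AI__xDAN_13b_l2_lora"

# Aggregated scores of the most recent run (the "results" config, "latest" split).
results = load_dataset(REPO, "results", split="latest")
print(results[0])  # a single row holding the aggregated metrics

# Per-sample details for one task, e.g. the 0-shot TruthfulQA MC config.
truthfulqa = load_dataset(REPO, "harness_truthfulqa_mc_0", split="latest")
print(truthfulqa)
```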
open-llm-leaderboard/details_xDAN-AI__xDAN_13b_l2_lora
[ "region:us" ]
2023-08-18T10:31:32+00:00
{"pretty_name": "Evaluation run of xDAN-AI/xDAN_13b_l2_lora", "dataset_summary": "Dataset automatically created during the evaluation run of model [xDAN-AI/xDAN_13b_l2_lora](https://huggingface.co/xDAN-AI/xDAN_13b_l2_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xDAN-AI__xDAN_13b_l2_lora\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-26T14:52:48.502405](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN-AI__xDAN_13b_l2_lora/blob/main/results_2023-07-26T14%3A52%3A48.502405.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5614989942866122,\n \"acc_stderr\": 0.034331003794690465,\n \"acc_norm\": 0.5656785190124449,\n \"acc_norm_stderr\": 0.03430930050159532,\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.44746680649420667,\n \"mc2_stderr\": 0.01496374462169886\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.01447113339264247,\n \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892889\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6207926707827126,\n \"acc_stderr\": 0.004841981973515282,\n \"acc_norm\": 0.8264289982075284,\n \"acc_norm_stderr\": 0.0037796612246514746\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n 
\"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112143,\n \"acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7171717171717171,\n \"acc_stderr\": 0.032087795587867514,\n \"acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.032087795587867514\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412195,\n 
\"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412195\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416416,\n \"acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416416\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.044642857142857144,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.044642857142857144\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n \"acc_stderr\": 0.015104550008905723,\n \"acc_norm\": 0.7675606641123882,\n \"acc_norm_stderr\": 
0.015104550008905723\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44692737430167595,\n \"acc_stderr\": 0.016628030039647614,\n \"acc_norm\": 0.44692737430167595,\n \"acc_norm_stderr\": 0.016628030039647614\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n \"acc_stderr\": 0.012530241301193182,\n \"acc_norm\": 0.40352020860495436,\n \"acc_norm_stderr\": 0.012530241301193182\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02011692534742242,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02011692534742242\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.44746680649420667,\n \"mc2_stderr\": 0.01496374462169886\n }\n}\n```", "repo_url": "https://huggingface.co/xDAN-AI/xDAN_13b_l2_lora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_26T14_52_48.502405", 
"path": ["**/details_harness|arc:challenge|25_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hellaswag|10_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:52:48.502405.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:52:48.502405.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:52:48.502405.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T14:52:48.502405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:52:48.502405.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_26T14_52_48.502405", "path": ["results_2023-07-26T14:52:48.502405.parquet"]}, {"split": "latest", "path": ["results_2023-07-26T14:52:48.502405.parquet"]}]}]}
2023-08-27T11:34:38+00:00
[]
[]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xDAN-AI/xDAN_13b_l2_lora## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xDAN-AI/xDAN_13b_l2_lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-26T14:52:48.502405 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5a71a5a4bff8eb6f4a863af009e4dc01426db3af
# Dataset Card for Evaluation run of ai-forever/rugpt3large_based_on_gpt2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/ai-forever/rugpt3large_based_on_gpt2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [ai-forever/rugpt3large_based_on_gpt2](https://huggingface.co/ai-forever/rugpt3large_based_on_gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). A short loading sketch for this "results" configuration follows this card.

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ai-forever__rugpt3large_based_on_gpt2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-28T14:21:57.108633](https://huggingface.co/datasets/open-llm-leaderboard/details_ai-forever__rugpt3large_based_on_gpt2/blob/main/results_2023-10-28T14-21-57.108633.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054710031,
        "f1": 0.04718854865771828,
        "f1_stderr": 0.0012961033721750263,
        "acc": 0.26710430338450897,
        "acc_stderr": 0.007769858100932027
    },
    "harness|drop|3": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054710031,
        "f1": 0.04718854865771828,
        "f1_stderr": 0.0012961033721750263
    },
    "harness|gsm8k|5": {
        "acc": 0.003032600454890068,
        "acc_stderr": 0.0015145735612245401
    },
    "harness|winogrande|5": {
        "acc": 0.5311760063141279,
        "acc_stderr": 0.014025142640639513
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
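As a complement to the per-task loading example in the card above, the aggregated scores can also be read from the "results" configuration that the card describes. The snippet below is a minimal sketch, assuming only the standard `datasets` API and the config/split names listed in this card; the exact column layout of the results rows is not documented here, so the final print is purely illustrative:

```python
from datasets import load_dataset

# Load the aggregated results for this model. The "latest" split always points
# to the most recent evaluation run recorded in the card (2023-10-28 here).
results = load_dataset(
    "open-llm-leaderboard/details_ai-forever__rugpt3large_based_on_gpt2",
    "results",
    split="latest",
)

# Inspect the first row to see which aggregated metrics are stored.
print(results[0])
```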
open-llm-leaderboard/details_ai-forever__rugpt3large_based_on_gpt2
[ "region:us" ]
2023-08-18T10:31:41+00:00
{"pretty_name": "Evaluation run of ai-forever/rugpt3large_based_on_gpt2", "dataset_summary": "Dataset automatically created during the evaluation run of model [ai-forever/rugpt3large_based_on_gpt2](https://huggingface.co/ai-forever/rugpt3large_based_on_gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ai-forever__rugpt3large_based_on_gpt2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T14:21:57.108633](https://huggingface.co/datasets/open-llm-leaderboard/details_ai-forever__rugpt3large_based_on_gpt2/blob/main/results_2023-10-28T14-21-57.108633.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710031,\n \"f1\": 0.04718854865771828,\n \"f1_stderr\": 0.0012961033721750263,\n \"acc\": 0.26710430338450897,\n \"acc_stderr\": 0.007769858100932027\n },\n \"harness|drop|3\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710031,\n \"f1\": 0.04718854865771828,\n \"f1_stderr\": 0.0012961033721750263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245401\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5311760063141279,\n \"acc_stderr\": 0.014025142640639513\n }\n}\n```", "repo_url": "https://huggingface.co/ai-forever/rugpt3large_based_on_gpt2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T14_21_57.108633", "path": ["**/details_harness|drop|3_2023-10-28T14-21-57.108633.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T14-21-57.108633.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T14_21_57.108633", "path": ["**/details_harness|gsm8k|5_2023-10-28T14-21-57.108633.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T14-21-57.108633.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hellaswag|10_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:06:47.872476.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:06:47.872476.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:06:47.872476.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:06:47.872476.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:06:47.872476.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:06:47.872476.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T14_21_57.108633", "path": ["**/details_harness|winogrande|5_2023-10-28T14-21-57.108633.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T14-21-57.108633.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T11_06_47.872476", "path": ["results_2023-07-19T11:06:47.872476.parquet"]}, {"split": "2023_10_28T14_21_57.108633", "path": ["results_2023-10-28T14-21-57.108633.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T14-21-57.108633.parquet"]}]}]}
2023-10-28T13:22:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ai-forever/rugpt3large_based_on_gpt2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model ai-forever/rugpt3large_based_on_gpt2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-28T14:21:57.108633 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of ai-forever/rugpt3large_based_on_gpt2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ai-forever/rugpt3large_based_on_gpt2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T14:21:57.108633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ai-forever/rugpt3large_based_on_gpt2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ai-forever/rugpt3large_based_on_gpt2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T14:21:57.108633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ai-forever/rugpt3large_based_on_gpt2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ai-forever/rugpt3large_based_on_gpt2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T14:21:57.108633(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3b21f8afa5e6f6d494442b4f79049d67d7cfca68
# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openaccess-ai-collective/manticore-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openaccess-ai-collective/manticore-13b](https://huggingface.co/openaccess-ai-collective/manticore-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. A short sketch showing how to enumerate these configurations programmatically follows this card.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T08:25:31.572792](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b/blob/main/results_2023-09-17T08-25-31.572792.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.03166946308724832,
        "em_stderr": 0.0017933779078599364,
        "f1": 0.0958106124161078,
        "f1_stderr": 0.002219577920640015,
        "acc": 0.44421971872451266,
        "acc_stderr": 0.010454624721475
    },
    "harness|drop|3": {
        "em": 0.03166946308724832,
        "em_stderr": 0.0017933779078599364,
        "f1": 0.0958106124161078,
        "f1_stderr": 0.002219577920640015
    },
    "harness|gsm8k|5": {
        "acc": 0.12206216830932524,
        "acc_stderr": 0.009017054965766476
    },
    "harness|winogrande|5": {
        "acc": 0.7663772691397001,
        "acc_stderr": 0.011892194477183524
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
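Because the card above lists 64 configurations (one per evaluated task plus the aggregated "results"), it can help to enumerate them programmatically before picking one. This is a minimal sketch, assuming the standard `datasets` helpers; the GSM8K config name is taken from the configs listed in this card's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b"

# Enumerate every configuration exposed by this details dataset.
configs = get_dataset_config_names(repo)
print(f"{len(configs)} configurations, e.g. {configs[:5]}")

# Load the latest split of one task-level configuration, e.g. the 5-shot GSM8K details.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```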
open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b
[ "region:us" ]
2023-08-18T10:31:50+00:00
{"pretty_name": "Evaluation run of openaccess-ai-collective/manticore-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/manticore-13b](https://huggingface.co/openaccess-ai-collective/manticore-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T08:25:31.572792](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-13b/blob/main/results_2023-09-17T08-25-31.572792.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03166946308724832,\n \"em_stderr\": 0.0017933779078599364,\n \"f1\": 0.0958106124161078,\n \"f1_stderr\": 0.002219577920640015,\n \"acc\": 0.44421971872451266,\n \"acc_stderr\": 0.010454624721475\n },\n \"harness|drop|3\": {\n \"em\": 0.03166946308724832,\n \"em_stderr\": 0.0017933779078599364,\n \"f1\": 0.0958106124161078,\n \"f1_stderr\": 0.002219577920640015\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12206216830932524,\n \"acc_stderr\": 0.009017054965766476\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/manticore-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T08_25_31.572792", "path": ["**/details_harness|drop|3_2023-09-17T08-25-31.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T08-25-31.572792.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T08_25_31.572792", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-25-31.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-25-31.572792.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:15:19.404064.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:15:19.404064.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:15:19.404064.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:15:19.404064.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:15:19.404064.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:15:19.404064.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T08_25_31.572792", "path": ["**/details_harness|winogrande|5_2023-09-17T08-25-31.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T08-25-31.572792.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_15_19.404064", "path": ["results_2023-07-19T19:15:19.404064.parquet"]}, {"split": "2023_09_17T08_25_31.572792", "path": ["results_2023-09-17T08-25-31.572792.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T08-25-31.572792.parquet"]}]}]}
2023-09-17T07:25:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T08:25:31.572792 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T08:25:31.572792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T08:25:31.572792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T08:25:31.572792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
067743be0a9cd67557a424a6ebd4bce84db95b13
# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openaccess-ai-collective/minotaur-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openaccess-ai-collective/minotaur-13b](https://huggingface.co/openaccess-ai-collective/minotaur-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T11:35:33.158218](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b/blob/main/results_2023-09-17T11-35-33.158218.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2911073825503356,
        "em_stderr": 0.004652179762964262,
        "f1": 0.3533399748322166,
        "f1_stderr": 0.004582757562230151,
        "acc": 0.4453413859606396,
        "acc_stderr": 0.01050936577304381
    },
    "harness|drop|3": {
        "em": 0.2911073825503356,
        "em_stderr": 0.004652179762964262,
        "f1": 0.3533399748322166,
        "f1_stderr": 0.004582757562230151
    },
    "harness|gsm8k|5": {
        "acc": 0.12509476876421532,
        "acc_stderr": 0.009112601439849629
    },
    "harness|winogrande|5": {
        "acc": 0.7655880031570639,
        "acc_stderr": 0.011906130106237988
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
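Beyond the loading snippet in the card above, the aggregated numbers quoted under "Latest results" come from the "results" configuration. The sketch below is only an illustration under the assumption that the config and split names declared for this dataset behave as listed; it inspects whatever columns the parquet file exposes rather than presuming a schema:

```python
from datasets import load_dataset

# Load the aggregated results of the latest run for minotaur-13b.
results = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b",
    "results",
    split="latest",
)
# Check the available columns before relying on any particular field name.
print(results.column_names)
print(results[0])
```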
open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b
[ "region:us" ]
2023-08-18T10:31:59+00:00
{"pretty_name": "Evaluation run of openaccess-ai-collective/minotaur-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/minotaur-13b](https://huggingface.co/openaccess-ai-collective/minotaur-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T11:35:33.158218](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b/blob/main/results_2023-09-17T11-35-33.158218.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2911073825503356,\n \"em_stderr\": 0.004652179762964262,\n \"f1\": 0.3533399748322166,\n \"f1_stderr\": 0.004582757562230151,\n \"acc\": 0.4453413859606396,\n \"acc_stderr\": 0.01050936577304381\n },\n \"harness|drop|3\": {\n \"em\": 0.2911073825503356,\n \"em_stderr\": 0.004652179762964262,\n \"f1\": 0.3533399748322166,\n \"f1_stderr\": 0.004582757562230151\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12509476876421532,\n \"acc_stderr\": 0.009112601439849629\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237988\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/minotaur-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T11_35_33.158218", "path": ["**/details_harness|drop|3_2023-09-17T11-35-33.158218.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T11-35-33.158218.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T11_35_33.158218", "path": ["**/details_harness|gsm8k|5_2023-09-17T11-35-33.158218.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T11-35-33.158218.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:13:52.077510.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:52.077510.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:52.077510.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:52.077510.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:13:52.077510.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:13:52.077510.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T11_35_33.158218", "path": ["**/details_harness|winogrande|5_2023-09-17T11-35-33.158218.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T11-35-33.158218.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_13_52.077510", "path": ["results_2023-07-19T19:13:52.077510.parquet"]}, {"split": "2023_09_17T11_35_33.158218", "path": ["results_2023-09-17T11-35-33.158218.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T11-35-33.158218.parquet"]}]}]}
2023-09-17T10:35:45+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T11:35:33.158218 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
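The flattened summary above refers to a loading snippet ("you can for instance do the following") that was dropped from this copy of the card. A minimal sketch is given below; the repository id is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming pattern, while the config name and the "latest" split are taken from the metadata listed earlier in this entry:

```python
from datasets import load_dataset

# Load one evaluation config of the minotaur-13b details dataset.
# NOTE: the repository id below is assumed from the naming pattern used by the
# Open LLM Leaderboard details datasets; it is not stated verbatim in this entry.
data = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b",
    "harness_winogrande_5",  # config name listed in the metadata above
    split="latest",          # "latest" always resolves to the most recent run
)
print(data)
```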
[ "# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T11:35:33.158218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T11:35:33.158218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T11:35:33.158218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c113b55b47dabdba7d56f024c556a0f0112fd170
# Dataset Card for Evaluation run of openaccess-ai-collective/hippogriff-30b-chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openaccess-ai-collective/hippogriff-30b-chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openaccess-ai-collective/hippogriff-30b-chat](https://huggingface.co/openaccess-ai-collective/hippogriff-30b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__hippogriff-30b-chat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T17:51:49.763366](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__hippogriff-30b-chat/blob/main/results_2023-09-17T17-51-49.763366.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.40625,
        "em_stderr": 0.005029654457747759,
        "f1": 0.45993603187919596,
        "f1_stderr": 0.004852091061102259,
        "acc": 0.45527931578597786,
        "acc_stderr": 0.009707160330434178
    },
    "harness|drop|3": {
        "em": 0.40625,
        "em_stderr": 0.005029654457747759,
        "f1": 0.45993603187919596,
        "f1_stderr": 0.004852091061102259
    },
    "harness|gsm8k|5": {
        "acc": 0.1023502653525398,
        "acc_stderr": 0.008349110996208829
    },
    "harness|winogrande|5": {
        "acc": 0.8082083662194159,
        "acc_stderr": 0.011065209664659527
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_openaccess-ai-collective__hippogriff-30b-chat
[ "region:us" ]
2023-08-18T10:32:08+00:00
{"pretty_name": "Evaluation run of openaccess-ai-collective/hippogriff-30b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/hippogriff-30b-chat](https://huggingface.co/openaccess-ai-collective/hippogriff-30b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__hippogriff-30b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T17:51:49.763366](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__hippogriff-30b-chat/blob/main/results_2023-09-17T17-51-49.763366.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.40625,\n \"em_stderr\": 0.005029654457747759,\n \"f1\": 0.45993603187919596,\n \"f1_stderr\": 0.004852091061102259,\n \"acc\": 0.45527931578597786,\n \"acc_stderr\": 0.009707160330434178\n },\n \"harness|drop|3\": {\n \"em\": 0.40625,\n \"em_stderr\": 0.005029654457747759,\n \"f1\": 0.45993603187919596,\n \"f1_stderr\": 0.004852091061102259\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1023502653525398,\n \"acc_stderr\": 0.008349110996208829\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/hippogriff-30b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T17_51_49.763366", "path": ["**/details_harness|drop|3_2023-09-17T17-51-49.763366.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T17-51-49.763366.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T17_51_49.763366", "path": ["**/details_harness|gsm8k|5_2023-09-17T17-51-49.763366.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T17-51-49.763366.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:55:46.065027.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:55:46.065027.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:55:46.065027.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:55:46.065027.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:55:46.065027.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:55:46.065027.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T17_51_49.763366", "path": ["**/details_harness|winogrande|5_2023-09-17T17-51-49.763366.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T17-51-49.763366.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_55_46.065027", "path": ["results_2023-07-19T22:55:46.065027.parquet"]}, {"split": "2023_09_17T17_51_49.763366", "path": ["results_2023-09-17T17-51-49.763366.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T17-51-49.763366.parquet"]}]}]}
2023-09-17T16:52:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openaccess-ai-collective/hippogriff-30b-chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openaccess-ai-collective/hippogriff-30b-chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T17:51:49.763366 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of openaccess-ai-collective/hippogriff-30b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/hippogriff-30b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T17:51:49.763366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openaccess-ai-collective/hippogriff-30b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/hippogriff-30b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T17:51:49.763366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/hippogriff-30b-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/hippogriff-30b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T17:51:49.763366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6493c1e22ea0de27f67335411d8c34bb5d632dbd
# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openaccess-ai-collective/manticore-30b-chat-pyg-alpha](https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T10:46:00.243267](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha/blob/main/results_2023-10-17T10-46-00.243267.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.26981963087248323,
        "em_stderr": 0.004545602964433134,
        "f1": 0.33224203020134285,
        "f1_stderr": 0.004467686561611976,
        "acc": 0.47775933431188755,
        "acc_stderr": 0.010733512146749623
    },
    "harness|drop|3": {
        "em": 0.26981963087248323,
        "em_stderr": 0.004545602964433134,
        "f1": 0.33224203020134285,
        "f1_stderr": 0.004467686561611976
    },
    "harness|gsm8k|5": {
        "acc": 0.1607278241091736,
        "acc_stderr": 0.010116708586037183
    },
    "harness|winogrande|5": {
        "acc": 0.7947908445146015,
        "acc_stderr": 0.011350315707462064
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
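As a supplement to the loading snippet above, here is a minimal sketch of how the aggregated metrics of the most recent run could be pulled. It assumes only the standard `datasets` API and the "results" configuration / "latest" split naming described in this card.

```python
from datasets import load_dataset

# The "results" configuration aggregates every run of this model; the "latest"
# split always points at the most recent evaluation (2023-10-17T10:46:00.243267 here).
results = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics of the latest run
```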
open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha
[ "region:us" ]
2023-08-18T10:32:17+00:00
{"pretty_name": "Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/manticore-30b-chat-pyg-alpha](https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T10:46:00.243267](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha/blob/main/results_2023-10-17T10-46-00.243267.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.26981963087248323,\n \"em_stderr\": 0.004545602964433134,\n \"f1\": 0.33224203020134285,\n \"f1_stderr\": 0.004467686561611976,\n \"acc\": 0.47775933431188755,\n \"acc_stderr\": 0.010733512146749623\n },\n \"harness|drop|3\": {\n \"em\": 0.26981963087248323,\n \"em_stderr\": 0.004545602964433134,\n \"f1\": 0.33224203020134285,\n \"f1_stderr\": 0.004467686561611976\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1607278241091736,\n \"acc_stderr\": 0.010116708586037183\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462064\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T10_46_00.243267", "path": ["**/details_harness|drop|3_2023-10-17T10-46-00.243267.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T10-46-00.243267.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T10_46_00.243267", "path": ["**/details_harness|gsm8k|5_2023-10-17T10-46-00.243267.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T10-46-00.243267.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:51:00.483071.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:51:00.483071.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:51:00.483071.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T10_46_00.243267", "path": ["**/details_harness|winogrande|5_2023-10-17T10-46-00.243267.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T10-46-00.243267.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_51_00.483071", "path": ["results_2023-07-19T22:51:00.483071.parquet"]}, {"split": "2023_10_17T10_46_00.243267", "path": ["results_2023-10-17T10-46-00.243267.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T10-46-00.243267.parquet"]}]}]}
2023-10-17T09:46:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-30b-chat-pyg-alpha on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T10:46:00.243267 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-30b-chat-pyg-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T10:46:00.243267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-30b-chat-pyg-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T10:46:00.243267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/manticore-30b-chat-pyg-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T10:46:00.243267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e1df91f9ddf1f3731fdd64c2a685f40e597598f2
# Dataset Card for Evaluation run of openaccess-ai-collective/wizard-mega-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openaccess-ai-collective/wizard-mega-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openaccess-ai-collective/wizard-mega-13b](https://huggingface.co/openaccess-ai-collective/wizard-mega-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__wizard-mega-13b_public",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-11-06T19:41:58.229759](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__wizard-mega-13b_public/blob/main/results_2023-11-06T19-41-58.229759.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0028313758389261743,
        "em_stderr": 0.0005441551135493785,
        "f1": 0.06436975671140957,
        "f1_stderr": 0.0014308039558345285,
        "acc": 0.4320270851671251,
        "acc_stderr": 0.010120811778666996
    },
    "harness|drop|3": {
        "em": 0.0028313758389261743,
        "em_stderr": 0.0005441551135493785,
        "f1": 0.06436975671140957,
        "f1_stderr": 0.0014308039558345285
    },
    "harness|gsm8k|5": {
        "acc": 0.10083396512509477,
        "acc_stderr": 0.008294031192126593
    },
    "harness|winogrande|5": {
        "acc": 0.7632202052091555,
        "acc_stderr": 0.011947592365207399
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
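As a supplement to the snippet above, a minimal sketch (assuming the standard `datasets` helpers `get_dataset_config_names` and `load_dataset`, with configuration names taken from this card's metadata) of how the available configurations could be enumerated before loading one task's details:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_openaccess-ai-collective__wizard-mega-13b_public"

# Per-task configurations (e.g. harness_drop_3, harness_gsm8k_5, harness_winogrande_5)
# plus the aggregated "results" configuration described above.
for name in get_dataset_config_names(repo):
    print(name)

# Details of one task, taking the "latest" split that tracks the most recent run.
drop_details = load_dataset(repo, "harness_drop_3", split="latest")
print(drop_details)
```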
open-llm-leaderboard/details_openaccess-ai-collective__wizard-mega-13b
[ "region:us" ]
2023-08-18T10:32:26+00:00
{"pretty_name": "Evaluation run of openaccess-ai-collective/wizard-mega-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/wizard-mega-13b](https://huggingface.co/openaccess-ai-collective/wizard-mega-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__wizard-mega-13b_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-06T19:41:58.229759](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__wizard-mega-13b_public/blob/main/results_2023-11-06T19-41-58.229759.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135493785,\n \"f1\": 0.06436975671140957,\n \"f1_stderr\": 0.0014308039558345285,\n \"acc\": 0.4320270851671251,\n \"acc_stderr\": 0.010120811778666996\n },\n \"harness|drop|3\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135493785,\n \"f1\": 0.06436975671140957,\n \"f1_stderr\": 0.0014308039558345285\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10083396512509477,\n \"acc_stderr\": 0.008294031192126593\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207399\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/wizard-mega-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_05T01_45_36.376645", "path": ["**/details_harness|drop|3_2023-11-05T01-45-36.376645.parquet"]}, {"split": "2023_11_06T19_41_58.229759", "path": ["**/details_harness|drop|3_2023-11-06T19-41-58.229759.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-06T19-41-58.229759.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_05T01_45_36.376645", "path": ["**/details_harness|gsm8k|5_2023-11-05T01-45-36.376645.parquet"]}, {"split": "2023_11_06T19_41_58.229759", "path": ["**/details_harness|gsm8k|5_2023-11-06T19-41-58.229759.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-06T19-41-58.229759.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_05T01_45_36.376645", "path": ["**/details_harness|winogrande|5_2023-11-05T01-45-36.376645.parquet"]}, {"split": "2023_11_06T19_41_58.229759", "path": 
["**/details_harness|winogrande|5_2023-11-06T19-41-58.229759.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-06T19-41-58.229759.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_05T01_45_36.376645", "path": ["results_2023-11-05T01-45-36.376645.parquet"]}, {"split": "2023_11_06T19_41_58.229759", "path": ["results_2023-11-06T19-41-58.229759.parquet"]}, {"split": "latest", "path": ["results_2023-11-06T19-41-58.229759.parquet"]}]}]}
2023-12-01T14:14:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openaccess-ai-collective/wizard-mega-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openaccess-ai-collective/wizard-mega-13b on the Open LLM Leaderboard. The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-11-06T19:41:58.229759 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of openaccess-ai-collective/wizard-mega-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/wizard-mega-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-06T19:41:58.229759(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openaccess-ai-collective/wizard-mega-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/wizard-mega-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-06T19:41:58.229759(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/wizard-mega-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/wizard-mega-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-06T19:41:58.229759(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
921e9879548e311189a710f5a6c2a1a8ff2501ec
# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b-fixed

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openaccess-ai-collective/minotaur-13b-fixed
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openaccess-ai-collective/minotaur-13b-fixed](https://huggingface.co/openaccess-ai-collective/minotaur-13b-fixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b-fixed",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T11:35:58.500746](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b-fixed/blob/main/results_2023-09-17T11-35-58.500746.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.08588506711409397,
        "em_stderr": 0.0028694514614574086,
        "f1": 0.15832529362416004,
        "f1_stderr": 0.0031260511951114243,
        "acc": 0.44995251838080197,
        "acc_stderr": 0.01057426968021918
    },
    "harness|drop|3": {
        "em": 0.08588506711409397,
        "em_stderr": 0.0028694514614574086,
        "f1": 0.15832529362416004,
        "f1_stderr": 0.0031260511951114243
    },
    "harness|gsm8k|5": {
        "acc": 0.13115996967399546,
        "acc_stderr": 0.009298499235587853
    },
    "harness|winogrande|5": {
        "acc": 0.7687450670876085,
        "acc_stderr": 0.01185004012485051
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
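Besides the per-task detail configurations, the metadata below also declares a "results" configuration that holds the aggregated metrics for each run. A minimal sketch, assuming the repo name from the card above; the exact column layout of the results parquet is not documented in the card, so treat the printed structure as the source of truth:

```python
from datasets import load_dataset

# The "results" configuration stores one row per evaluation run with the
# serialized aggregate metrics; inspect the loaded row rather than relying
# on any field names assumed here.
results = load_dataset(
    "open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b-fixed",
    "results",
    split="latest",
)
print(results.column_names)
print(results[0])
```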
open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b-fixed
[ "region:us" ]
2023-08-18T10:32:35+00:00
{"pretty_name": "Evaluation run of openaccess-ai-collective/minotaur-13b-fixed", "dataset_summary": "Dataset automatically created during the evaluation run of model [openaccess-ai-collective/minotaur-13b-fixed](https://huggingface.co/openaccess-ai-collective/minotaur-13b-fixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b-fixed\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T11:35:58.500746](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__minotaur-13b-fixed/blob/main/results_2023-09-17T11-35-58.500746.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08588506711409397,\n \"em_stderr\": 0.0028694514614574086,\n \"f1\": 0.15832529362416004,\n \"f1_stderr\": 0.0031260511951114243,\n \"acc\": 0.44995251838080197,\n \"acc_stderr\": 0.01057426968021918\n },\n \"harness|drop|3\": {\n \"em\": 0.08588506711409397,\n \"em_stderr\": 0.0028694514614574086,\n \"f1\": 0.15832529362416004,\n \"f1_stderr\": 0.0031260511951114243\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \"acc_stderr\": 0.009298499235587853\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n }\n}\n```", "repo_url": "https://huggingface.co/openaccess-ai-collective/minotaur-13b-fixed", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|arc:challenge|25_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T11_35_58.500746", "path": ["**/details_harness|drop|3_2023-09-17T11-35-58.500746.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T11-35-58.500746.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T11_35_58.500746", "path": ["**/details_harness|gsm8k|5_2023-09-17T11-35-58.500746.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T11-35-58.500746.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hellaswag|10_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:56:58.097671.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:56:58.097671.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:56:58.097671.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T12:56:58.097671.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T12:56:58.097671.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T12:56:58.097671.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T11_35_58.500746", "path": ["**/details_harness|winogrande|5_2023-09-17T11-35-58.500746.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T11-35-58.500746.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T12_56_58.097671", "path": ["results_2023-07-24T12:56:58.097671.parquet"]}, {"split": "2023_09_17T11_35_58.500746", "path": ["results_2023-09-17T11-35-58.500746.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T11-35-58.500746.parquet"]}]}]}
2023-09-17T10:36:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b-fixed ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b-fixed on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T11:35:58.500746 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b-fixed", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b-fixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T11:35:58.500746(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b-fixed", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b-fixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T11:35:58.500746(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openaccess-ai-collective/minotaur-13b-fixed## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openaccess-ai-collective/minotaur-13b-fixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T11:35:58.500746(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6e0f08226cb803ef7082cfa5dd978025bf0a7bb4
# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-instruct

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/bofenghuang/vigogne-13b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-13b-instruct](https://huggingface.co/bofenghuang/vigogne-13b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-13b-instruct",
    "harness_winogrande_5",
    split="train")
```

A complementary loading sketch is given just after this card.

## Latest results

These are the [latest results from run 2023-09-23T12:07:22.235368](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-13b-instruct/blob/main/results_2023-09-23T12-07-22.235368.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.17365771812080538,
        "em_stderr": 0.003879418958892452,
        "f1": 0.24381711409395945,
        "f1_stderr": 0.003948980378514697,
        "acc": 0.44469214138811486,
        "acc_stderr": 0.010351218038230168
    },
    "harness|drop|3": {
        "em": 0.17365771812080538,
        "em_stderr": 0.003879418958892452,
        "f1": 0.24381711409395945,
        "f1_stderr": 0.003948980378514697
    },
    "harness|gsm8k|5": {
        "acc": 0.11827141774071266,
        "acc_stderr": 0.008895075852434951
    },
    "harness|winogrande|5": {
        "acc": 0.771112865035517,
        "acc_stderr": 0.011807360224025386
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
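As a complement to the loading example above, here is a minimal sketch (assuming the `datasets` library is installed) of pulling the aggregated metrics instead of the per-sample details; the `results` config and its `latest` split are the ones declared in this record's configuration metadata below.

```python
from datasets import load_dataset

# A minimal sketch: load the aggregated metrics ("results" config) rather than
# the per-sample details of a single task. The "latest" split is an alias that
# always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigogne-13b-instruct",
    "results",
    split="latest",
)
print(results[0])  # inspect the first row of aggregated metrics
```

The same pattern should work for any config name listed in this record's metadata.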
open-llm-leaderboard/details_bofenghuang__vigogne-13b-instruct
[ "region:us" ]
2023-08-18T10:32:44+00:00
{"pretty_name": "Evaluation run of bofenghuang/vigogne-13b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-13b-instruct](https://huggingface.co/bofenghuang/vigogne-13b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-13b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T12:07:22.235368](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-13b-instruct/blob/main/results_2023-09-23T12-07-22.235368.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17365771812080538,\n \"em_stderr\": 0.003879418958892452,\n \"f1\": 0.24381711409395945,\n \"f1_stderr\": 0.003948980378514697,\n \"acc\": 0.44469214138811486,\n \"acc_stderr\": 0.010351218038230168\n },\n \"harness|drop|3\": {\n \"em\": 0.17365771812080538,\n \"em_stderr\": 0.003879418958892452,\n \"f1\": 0.24381711409395945,\n \"f1_stderr\": 0.003948980378514697\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11827141774071266,\n \"acc_stderr\": 0.008895075852434951\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025386\n }\n}\n```", "repo_url": "https://huggingface.co/bofenghuang/vigogne-13b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T12_07_22.235368", "path": ["**/details_harness|drop|3_2023-09-23T12-07-22.235368.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T12-07-22.235368.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T12_07_22.235368", "path": ["**/details_harness|gsm8k|5_2023-09-23T12-07-22.235368.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T12-07-22.235368.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:44:50.658520.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:44:50.658520.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:44:50.658520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:44:50.658520.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:44:50.658520.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T12_07_22.235368", "path": ["**/details_harness|winogrande|5_2023-09-23T12-07-22.235368.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T12-07-22.235368.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T14_44_50.658520", "path": ["results_2023-07-24T14:44:50.658520.parquet"]}, {"split": "2023_09_23T12_07_22.235368", "path": ["results_2023-09-23T12-07-22.235368.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T12-07-22.235368.parquet"]}]}]}
2023-09-23T11:07:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T12:07:22.235368(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T12:07:22.235368(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T12:07:22.235368(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T12:07:22.235368(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a1da7492286a51b10c6cbd430a91e57d1735d3fb
# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-chat

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/bofenghuang/vigogne-13b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-13b-chat](https://huggingface.co/bofenghuang/vigogne-13b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-13b-chat",
    "harness_winogrande_5",
    split="train")
```

A complementary loading sketch (using a timestamped split) is given just after this card.

## Latest results

These are the [latest results from run 2023-09-22T17:13:01.877874](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-13b-chat/blob/main/results_2023-09-22T17-13-01.877874.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.28093540268456374,
        "em_stderr": 0.004602850194300876,
        "f1": 0.3580620805369141,
        "f1_stderr": 0.0045122087324351344,
        "acc": 0.42528152381590656,
        "acc_stderr": 0.009746925675481622
    },
    "harness|drop|3": {
        "em": 0.28093540268456374,
        "em_stderr": 0.004602850194300876,
        "f1": 0.3580620805369141,
        "f1_stderr": 0.0045122087324351344
    },
    "harness|gsm8k|5": {
        "acc": 0.08339651250947688,
        "acc_stderr": 0.007615650277106699
    },
    "harness|winogrande|5": {
        "acc": 0.7671665351223362,
        "acc_stderr": 0.011878201073856544
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
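Similarly, here is a minimal sketch (again assuming the `datasets` library is installed) of loading the per-sample details from one specific timestamped run instead of the `latest` alias; the config name and split timestamp below are taken from this record's configuration metadata.

```python
from datasets import load_dataset

# A minimal sketch: load the GSM8K per-sample details from a specific run by
# addressing its timestamped split rather than the "latest" alias.
details = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigogne-13b-chat",
    "harness_gsm8k_5",
    split="2023_09_22T17_13_01.877874",
)
print(len(details), details.column_names)  # number of samples and their fields
```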
open-llm-leaderboard/details_bofenghuang__vigogne-13b-chat
[ "region:us" ]
2023-08-18T10:32:52+00:00
{"pretty_name": "Evaluation run of bofenghuang/vigogne-13b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-13b-chat](https://huggingface.co/bofenghuang/vigogne-13b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-13b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T17:13:01.877874](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-13b-chat/blob/main/results_2023-09-22T17-13-01.877874.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.28093540268456374,\n \"em_stderr\": 0.004602850194300876,\n \"f1\": 0.3580620805369141,\n \"f1_stderr\": 0.0045122087324351344,\n \"acc\": 0.42528152381590656,\n \"acc_stderr\": 0.009746925675481622\n },\n \"harness|drop|3\": {\n \"em\": 0.28093540268456374,\n \"em_stderr\": 0.004602850194300876,\n \"f1\": 0.3580620805369141,\n \"f1_stderr\": 0.0045122087324351344\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \"acc_stderr\": 0.007615650277106699\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n }\n}\n```", "repo_url": "https://huggingface.co/bofenghuang/vigogne-13b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|arc:challenge|25_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T17_13_01.877874", "path": ["**/details_harness|drop|3_2023-09-22T17-13-01.877874.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T17-13-01.877874.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T17_13_01.877874", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-13-01.877874.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-13-01.877874.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hellaswag|10_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:18:55.409320.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:18:55.409320.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T11:18:55.409320.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T11:18:55.409320.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T11:18:55.409320.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T17_13_01.877874", "path": ["**/details_harness|winogrande|5_2023-09-22T17-13-01.877874.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T17-13-01.877874.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T11_18_55.409320", "path": ["results_2023-07-25T11:18:55.409320.parquet"]}, {"split": "2023_09_22T17_13_01.877874", "path": ["results_2023-09-22T17-13-01.877874.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T17-13-01.877874.parquet"]}]}]}
2023-09-22T16:13:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T17:13:01.877874 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:13:01.877874(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:13:01.877874(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bofenghuang/vigogne-13b-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-13b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T17:13:01.877874(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
15fb7bba501c07dae9a2f1f82189be16c7a22714
# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/bofenghuang/vigogne-2-7b-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-2-7b-instruct](https://huggingface.co/bofenghuang/vigogne-2-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T08:45:31.930950](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct/blob/main/results_2023-09-23T08-45-31.930950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2985528523489933,
        "em_stderr": 0.0046864904941642995,
        "f1": 0.3518403942953031,
        "f1_stderr": 0.004613402461586294,
        "acc": 0.39622289254314186,
        "acc_stderr": 0.008677803422491042
    },
    "harness|drop|3": {
        "em": 0.2985528523489933,
        "em_stderr": 0.0046864904941642995,
        "f1": 0.3518403942953031,
        "f1_stderr": 0.004613402461586294
    },
    "harness|gsm8k|5": {
        "acc": 0.03790750568612585,
        "acc_stderr": 0.005260333907798437
    },
    "harness|winogrande|5": {
        "acc": 0.7545382794001578,
        "acc_stderr": 0.012095272937183646
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
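The aggregated metrics shown above can also be loaded directly rather than copied from this card. The snippet below is a minimal sketch that assumes the "results" configuration described in the summary exposes a "latest" split, following the same naming convention as the per-task configurations; the column layout of that table is not documented here, so it is inspected rather than assumed.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run, via the "results" configuration
# mentioned in the summary above (the "latest" split name is assumed to follow
# the same convention as the per-task configurations).
results = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct",
    "results",
    split="latest",
)

# The column layout is not documented here, so inspect it before use.
print(results.column_names)
print(results[0])
```

Replacing "latest" with one of the timestamped split names selects a specific historical run instead.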
open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct
[ "region:us" ]
2023-08-18T10:33:01+00:00
{"pretty_name": "Evaluation run of bofenghuang/vigogne-2-7b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-2-7b-instruct](https://huggingface.co/bofenghuang/vigogne-2-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T08:45:31.930950](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct/blob/main/results_2023-09-23T08-45-31.930950.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2985528523489933,\n \"em_stderr\": 0.0046864904941642995,\n \"f1\": 0.3518403942953031,\n \"f1_stderr\": 0.004613402461586294,\n \"acc\": 0.39622289254314186,\n \"acc_stderr\": 0.008677803422491042\n },\n \"harness|drop|3\": {\n \"em\": 0.2985528523489933,\n \"em_stderr\": 0.0046864904941642995,\n \"f1\": 0.3518403942953031,\n \"f1_stderr\": 0.004613402461586294\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03790750568612585,\n \"acc_stderr\": 0.005260333907798437\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183646\n }\n}\n```", "repo_url": "https://huggingface.co/bofenghuang/vigogne-2-7b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T08_45_31.930950", "path": ["**/details_harness|drop|3_2023-09-23T08-45-31.930950.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T08-45-31.930950.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T08_45_31.930950", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-45-31.930950.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-45-31.930950.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:36:05.447803.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:36:05.447803.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T08_45_31.930950", "path": ["**/details_harness|winogrande|5_2023-09-23T08-45-31.930950.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T08-45-31.930950.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T10_36_05.447803", "path": ["results_2023-07-25T10:36:05.447803.parquet"]}, {"split": "2023_09_23T08_45_31.930950", "path": ["results_2023-09-23T08-45-31.930950.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T08-45-31.930950.parquet"]}]}]}
2023-09-23T07:45:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model bofenghuang/vigogne-2-7b-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T08:45:31.930950 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-2-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T08:45:31.930950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-2-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T08:45:31.930950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-2-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T08:45:31.930950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a71a9f3117c8e997fc4881f2dea74866bb019a78
# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/bofenghuang/vigogne-7b-chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-7b-chat](https://huggingface.co/bofenghuang/vigogne-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-7b-chat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T09:53:08.611254](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-7b-chat/blob/main/results_2023-10-15T09-53-08.611254.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2670931208053691,
        "em_stderr": 0.004531013974545822,
        "f1": 0.31957005033557123,
        "f1_stderr": 0.004492252212863049,
        "acc": 0.4037322886379806,
        "acc_stderr": 0.009872273041361887
    },
    "harness|drop|3": {
        "em": 0.2670931208053691,
        "em_stderr": 0.004531013974545822,
        "f1": 0.31957005033557123,
        "f1_stderr": 0.004492252212863049
    },
    "harness|gsm8k|5": {
        "acc": 0.0758150113722517,
        "acc_stderr": 0.007291205723162577
    },
    "harness|winogrande|5": {
        "acc": 0.7316495659037096,
        "acc_stderr": 0.012453340359561195
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
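Beyond the per-task example above, a minimal sketch of pulling the aggregated metrics is shown below. It assumes only the `datasets` library plus the "results" configuration and "latest" split names listed in this repository's metadata; the variable names are illustrative.

```python
from datasets import load_dataset

# Load the aggregated "results" configuration of this evaluation-details repo;
# per the card, the "latest" split points at the most recent run's aggregated metrics.
results = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigogne-7b-chat",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics of one run; inspect the first one.
print(results[0])
```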
open-llm-leaderboard/details_bofenghuang__vigogne-7b-chat
[ "region:us" ]
2023-08-18T10:33:10+00:00
{"pretty_name": "Evaluation run of bofenghuang/vigogne-7b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-7b-chat](https://huggingface.co/bofenghuang/vigogne-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-7b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T09:53:08.611254](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-7b-chat/blob/main/results_2023-10-15T09-53-08.611254.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2670931208053691,\n \"em_stderr\": 0.004531013974545822,\n \"f1\": 0.31957005033557123,\n \"f1_stderr\": 0.004492252212863049,\n \"acc\": 0.4037322886379806,\n \"acc_stderr\": 0.009872273041361887\n },\n \"harness|drop|3\": {\n \"em\": 0.2670931208053691,\n \"em_stderr\": 0.004531013974545822,\n \"f1\": 0.31957005033557123,\n \"f1_stderr\": 0.004492252212863049\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \"acc_stderr\": 0.007291205723162577\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n }\n}\n```", "repo_url": "https://huggingface.co/bofenghuang/vigogne-7b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T09_53_08.611254", "path": ["**/details_harness|drop|3_2023-10-15T09-53-08.611254.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T09-53-08.611254.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T09_53_08.611254", "path": ["**/details_harness|gsm8k|5_2023-10-15T09-53-08.611254.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T09-53-08.611254.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:29.962597.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:29.962597.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:29.962597.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:58:29.962597.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:58:29.962597.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T09_53_08.611254", "path": ["**/details_harness|winogrande|5_2023-10-15T09-53-08.611254.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T09-53-08.611254.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T10_58_29.962597", "path": ["results_2023-07-25T10:58:29.962597.parquet"]}, {"split": "2023_10_15T09_53_08.611254", "path": ["results_2023-10-15T09-53-08.611254.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T09-53-08.611254.parquet"]}]}]}
2023-10-15T08:53:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T09:53:08.611254 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T09:53:08.611254(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T09:53:08.611254(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T09:53:08.611254(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6083ee4cf1e39edba9dbeed81187ffad7cfdd80b
# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/bofenghuang/vigogne-7b-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-7b-instruct](https://huggingface.co/bofenghuang/vigogne-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-7b-instruct",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T01:49:04.568922](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-7b-instruct/blob/main/results_2023-10-15T01-49-04.568922.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.205746644295302,
        "em_stderr": 0.0041398591062581916,
        "f1": 0.2632434983221478,
        "f1_stderr": 0.004152783116171037,
        "acc": 0.3778929530335878,
        "acc_stderr": 0.008493710817551448
    },
    "harness|drop|3": {
        "em": 0.205746644295302,
        "em_stderr": 0.0041398591062581916,
        "f1": 0.2632434983221478,
        "f1_stderr": 0.004152783116171037
    },
    "harness|gsm8k|5": {
        "acc": 0.027293404094010616,
        "acc_stderr": 0.00448809538020977
    },
    "harness|winogrande|5": {
        "acc": 0.728492501973165,
        "acc_stderr": 0.012499326254893127
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
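As a supplementary illustration (a minimal sketch, not part of the generated card), the aggregated "results" configuration mentioned in the summary can be loaded in the same way. The config name "results" and the "latest" split are taken from this record's metadata; treat the snippet as an example rather than documented usage.

```python
from datasets import load_dataset

# Minimal sketch: fetch the aggregated metrics instead of a per-task config.
# "results" and "latest" are the config/split names listed in this record's
# metadata; they are assumptions about the repository layout, not guarantees.
results = load_dataset(
    "open-llm-leaderboard/details_bofenghuang__vigogne-7b-instruct",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the latest run
```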
open-llm-leaderboard/details_bofenghuang__vigogne-7b-instruct
[ "region:us" ]
2023-08-18T10:33:18+00:00
{"pretty_name": "Evaluation run of bofenghuang/vigogne-7b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-7b-instruct](https://huggingface.co/bofenghuang/vigogne-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-7b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T01:49:04.568922](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-7b-instruct/blob/main/results_2023-10-15T01-49-04.568922.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.205746644295302,\n \"em_stderr\": 0.0041398591062581916,\n \"f1\": 0.2632434983221478,\n \"f1_stderr\": 0.004152783116171037,\n \"acc\": 0.3778929530335878,\n \"acc_stderr\": 0.008493710817551448\n },\n \"harness|drop|3\": {\n \"em\": 0.205746644295302,\n \"em_stderr\": 0.0041398591062581916,\n \"f1\": 0.2632434983221478,\n \"f1_stderr\": 0.004152783116171037\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.027293404094010616,\n \"acc_stderr\": 0.00448809538020977\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893127\n }\n}\n```", "repo_url": "https://huggingface.co/bofenghuang/vigogne-7b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|arc:challenge|25_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T01_49_04.568922", "path": ["**/details_harness|drop|3_2023-10-15T01-49-04.568922.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T01-49-04.568922.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T01_49_04.568922", "path": ["**/details_harness|gsm8k|5_2023-10-15T01-49-04.568922.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T01-49-04.568922.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hellaswag|10_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:54:54.750661.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:54:54.750661.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T13:54:54.750661.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T13:54:54.750661.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T13:54:54.750661.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T01_49_04.568922", "path": ["**/details_harness|winogrande|5_2023-10-15T01-49-04.568922.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T01-49-04.568922.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T13_54_54.750661", "path": ["results_2023-07-25T13:54:54.750661.parquet"]}, {"split": "2023_10_15T01_49_04.568922", "path": ["results_2023-10-15T01-49-04.568922.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T01-49-04.568922.parquet"]}]}]}
2023-10-15T00:49:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T01:49:04.568922 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T01:49:04.568922(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T01:49:04.568922(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bofenghuang/vigogne-7b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bofenghuang/vigogne-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T01:49:04.568922(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
65a95995fcdfc976b88b65dc6d77929bed0f9ef9
# Dataset Card for Evaluation run of aisquared/dlite-v1-1_5b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/dlite-v1-1_5b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-1_5b](https://huggingface.co/aisquared/dlite-v1-1_5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v1-1_5b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T17:48:40.273494](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-1_5b/blob/main/results_2023-09-23T17-48-40.273494.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.005977348993288591,
        "em_stderr": 0.0007893908687131983,
        "f1": 0.06289953859060417,
        "f1_stderr": 0.0015069024652225058,
        "acc": 0.28017386590137583,
        "acc_stderr": 0.00735524021281907
    },
    "harness|drop|3": {
        "em": 0.005977348993288591,
        "em_stderr": 0.0007893908687131983,
        "f1": 0.06289953859060417,
        "f1_stderr": 0.0015069024652225058
    },
    "harness|gsm8k|5": {
        "acc": 0.000758150113722517,
        "acc_stderr": 0.0007581501137225347
    },
    "harness|winogrande|5": {
        "acc": 0.5595895816890292,
        "acc_stderr": 0.013952330311915607
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
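As a supplementary illustration (a minimal sketch, not part of the generated card), a single timestamped run can be loaded by passing the run's split name instead of the "latest" alias. The config name and split below are taken from this record's metadata and are assumed to still be exposed by the repository.

```python
from datasets import load_dataset

# Minimal sketch: load one specific run of the GSM8K config by its timestamped
# split name (taken from this record's metadata) rather than the "latest" alias.
run = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v1-1_5b",
    "harness_gsm8k_5",
    split="2023_09_23T17_48_40.273494",
)
print(len(run))  # number of per-example detail rows recorded for that run
```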
open-llm-leaderboard/details_aisquared__dlite-v1-1_5b
[ "region:us" ]
2023-08-18T10:33:27+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v1-1_5b", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-1_5b](https://huggingface.co/aisquared/dlite-v1-1_5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v1-1_5b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T17:48:40.273494](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-1_5b/blob/main/results_2023-09-23T17-48-40.273494.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005977348993288591,\n \"em_stderr\": 0.0007893908687131983,\n \"f1\": 0.06289953859060417,\n \"f1_stderr\": 0.0015069024652225058,\n \"acc\": 0.28017386590137583,\n \"acc_stderr\": 0.00735524021281907\n },\n \"harness|drop|3\": {\n \"em\": 0.005977348993288591,\n \"em_stderr\": 0.0007893908687131983,\n \"f1\": 0.06289953859060417,\n \"f1_stderr\": 0.0015069024652225058\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225347\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5595895816890292,\n \"acc_stderr\": 0.013952330311915607\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v1-1_5b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T17_48_40.273494", "path": ["**/details_harness|drop|3_2023-09-23T17-48-40.273494.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T17-48-40.273494.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T17_48_40.273494", "path": ["**/details_harness|gsm8k|5_2023-09-23T17-48-40.273494.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T17-48-40.273494.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:22:45.415057.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:22:45.415057.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:22:45.415057.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:22:45.415057.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:22:45.415057.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T17_48_40.273494", "path": ["**/details_harness|winogrande|5_2023-09-23T17-48-40.273494.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T17-48-40.273494.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_22_45.415057", "path": ["results_2023-07-19T15:22:45.415057.parquet"]}, {"split": "2023_09_23T17_48_40.273494", "path": ["results_2023-09-23T17-48-40.273494.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T17-48-40.273494.parquet"]}]}]}
2023-09-23T16:48:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v1-1_5b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/dlite-v1-1_5b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T17:48:40.273494 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
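The sentence "To load the details from a run, you can for instance do the following:" in the flattened card above originally introduced a code block that was stripped during processing; the snippet below reproduces it from the `dataset_summary` field in the metadata earlier in this record:

```python
from datasets import load_dataset

# Reproduced from the dataset_summary metadata for this card.
data = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v1-1_5b",
    "harness_winogrande_5",
    split="train",
)
```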
[ "# Dataset Card for Evaluation run of aisquared/dlite-v1-1_5b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-1_5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T17:48:40.273494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v1-1_5b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-1_5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T17:48:40.273494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v1-1_5b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-1_5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T17:48:40.273494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e575d77352d87e8f87b44d3446a9bdaa93660ecc
# Dataset Card for Evaluation run of aisquared/dlite-v2-355m

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/dlite-v2-355m
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-355m](https://huggingface.co/aisquared/dlite-v2-355m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v2-355m",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T23:07:25.491864](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-355m/blob/main/results_2023-10-15T23-07-25.491864.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001572986577181208,
        "em_stderr": 0.000405845113241774,
        "f1": 0.055305159395973226,
        "f1_stderr": 0.001369522078512369,
        "acc": 0.26400947119179163,
        "acc_stderr": 0.007015202106702892
    },
    "harness|drop|3": {
        "em": 0.001572986577181208,
        "em_stderr": 0.000405845113241774,
        "f1": 0.055305159395973226,
        "f1_stderr": 0.001369522078512369
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5280189423835833,
        "acc_stderr": 0.014030404213405784
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
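The card above also mentions the aggregated "results" configuration. Below is a hedged sketch of inspecting the latest aggregated metrics; the config name "results" and the "latest" split follow the conventions shown elsewhere in this document, and the exact schema of the results files is not documented here, so treat this as illustrative:

```python
from datasets import load_dataset

# "results" aggregates all runs for this model; the "latest" split points
# at the most recent run (naming convention used throughout these cards).
results = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v2-355m",
    "results",
    split="latest",
)
print(results)
```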
open-llm-leaderboard/details_aisquared__dlite-v2-355m
[ "region:us" ]
2023-08-18T10:33:35+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v2-355m", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-355m](https://huggingface.co/aisquared/dlite-v2-355m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v2-355m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T23:07:25.491864](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-355m/blob/main/results_2023-10-15T23-07-25.491864.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241774,\n \"f1\": 0.055305159395973226,\n \"f1_stderr\": 0.001369522078512369,\n \"acc\": 0.26400947119179163,\n \"acc_stderr\": 0.007015202106702892\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241774,\n \"f1\": 0.055305159395973226,\n \"f1_stderr\": 0.001369522078512369\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5280189423835833,\n \"acc_stderr\": 0.014030404213405784\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v2-355m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T23_07_25.491864", "path": ["**/details_harness|drop|3_2023-10-15T23-07-25.491864.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T23-07-25.491864.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T23_07_25.491864", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-07-25.491864.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-07-25.491864.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:14:13.332045.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:14:13.332045.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T23_07_25.491864", "path": ["**/details_harness|winogrande|5_2023-10-15T23-07-25.491864.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T23-07-25.491864.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_14_13.332045", "path": ["results_2023-07-19T14:14:13.332045.parquet"]}, {"split": "2023_10_15T23_07_25.491864", "path": ["results_2023-10-15T23-07-25.491864.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T23-07-25.491864.parquet"]}]}]}
2023-10-15T22:08:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v2-355m ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/dlite-v2-355m on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the load sketch after this card): ## Latest results These are the latest results from run 2023-10-15T23:07:25.491864 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
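A minimal sketch of the load call referenced above (the Python snippet itself is stripped in this plain-text rendering). The repository id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming pattern; the config and split names come from the metadata block above:

```python
from datasets import load_dataset

# Assumed repository id for this record (not stated verbatim in the stripped text above).
data = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v2-355m",
    "harness_winogrande_5",  # one of the per-task configs listed in the metadata
    split="latest",          # the "latest" split points at the most recent run
)
print(data)
```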
[ "# Dataset Card for Evaluation run of aisquared/dlite-v2-355m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-355m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T23:07:25.491864(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v2-355m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-355m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T23:07:25.491864(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v2-355m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-355m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T23:07:25.491864(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f670c40bd4559ff569645e6c6082ea387113dbe5
# Dataset Card for Evaluation run of aisquared/dlite-v2-774m

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/dlite-v2-774m
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-774m](https://huggingface.co/aisquared/dlite-v2-774m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v2-774m",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T06:47:53.119042](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-774m/blob/main/results_2023-10-13T06-47-53.119042.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.009437919463087249,
        "em_stderr": 0.0009901902239103345,
        "f1": 0.059256501677852416,
        "f1_stderr": 0.0015878342558663697,
        "acc": 0.26992896606156275,
        "acc_stderr": 0.007003882714182583
    },
    "harness|drop|3": {
        "em": 0.009437919463087249,
        "em_stderr": 0.0009901902239103345,
        "f1": 0.059256501677852416,
        "f1_stderr": 0.0015878342558663697
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5398579321231255,
        "acc_stderr": 0.014007765428365166
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
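Beyond the per-task details shown in the card's own snippet, the aggregated numbers quoted under "Latest results" can be pulled from the `results` config; a small sketch, using the config and split names that appear in the metadata record further below:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run (the JSON shown in the card).
results = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v2-774m",
    "results",
    split="latest",  # maps to results_2023-10-13T06-47-53.119042.parquet per the metadata
)
print(results.column_names)  # inspect the available metric columns
print(results[0])
```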
open-llm-leaderboard/details_aisquared__dlite-v2-774m
[ "region:us" ]
2023-08-18T10:33:43+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v2-774m", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-774m](https://huggingface.co/aisquared/dlite-v2-774m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v2-774m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T06:47:53.119042](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-774m/blob/main/results_2023-10-13T06-47-53.119042.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009437919463087249,\n \"em_stderr\": 0.0009901902239103345,\n \"f1\": 0.059256501677852416,\n \"f1_stderr\": 0.0015878342558663697,\n \"acc\": 0.26992896606156275,\n \"acc_stderr\": 0.007003882714182583\n },\n \"harness|drop|3\": {\n \"em\": 0.009437919463087249,\n \"em_stderr\": 0.0009901902239103345,\n \"f1\": 0.059256501677852416,\n \"f1_stderr\": 0.0015878342558663697\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5398579321231255,\n \"acc_stderr\": 0.014007765428365166\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v2-774m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T06_47_53.119042", "path": ["**/details_harness|drop|3_2023-10-13T06-47-53.119042.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T06-47-53.119042.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T06_47_53.119042", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-47-53.119042.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-47-53.119042.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:27:10.189986.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:27:10.189986.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T06_47_53.119042", "path": ["**/details_harness|winogrande|5_2023-10-13T06-47-53.119042.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T06-47-53.119042.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_27_10.189986", "path": ["results_2023-07-19T14:27:10.189986.parquet"]}, {"split": "2023_10_13T06_47_53.119042", "path": ["results_2023-10-13T06-47-53.119042.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T06-47-53.119042.parquet"]}]}]}
2023-10-13T05:48:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v2-774m ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/dlite-v2-774m on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T06:47:53.119042(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of aisquared/dlite-v2-774m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-774m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:47:53.119042(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v2-774m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-774m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:47:53.119042(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v2-774m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-774m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T06:47:53.119042(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a0eec75582d7cf237308de4db4b104e44bbad8fa
# Dataset Card for Evaluation run of aisquared/chopt-2_7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/chopt-2_7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/chopt-2_7b](https://huggingface.co/aisquared/chopt-2_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__chopt-2_7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T00:33:45.271884](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__chopt-2_7b/blob/main/results_2023-10-13T00-33-45.271884.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001363255033557047,
        "em_stderr": 0.0003778609196461222,
        "f1": 0.04857906879194641,
        "f1_stderr": 0.0012385170365466402,
        "acc": 0.2888713496448303,
        "acc_stderr": 0.006940791015329276
    },
    "harness|drop|3": {
        "em": 0.001363255033557047,
        "em_stderr": 0.0003778609196461222,
        "f1": 0.04857906879194641,
        "f1_stderr": 0.0012385170365466402
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5777426992896606,
        "acc_stderr": 0.013881582030658552
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
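The snippet in the card above loads a single harness configuration with `split="train"`. As a minimal illustrative sketch (not part of the original card), the same `load_dataset` call can also target the aggregated "results" configuration or pin a specific run via its timestamped split; the config and split names used here are taken from the configurations enumerated in this record's metadata, and the example assumes the repository still exposes those splits.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run ("results" config, "latest" split),
# as enumerated in this record's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_aisquared__chopt-2_7b",
    "results",
    split="latest",
)

# Per-example details for one task, pinned to a specific run by its
# timestamped split name rather than the moving "latest" alias.
gsm8k_run = load_dataset(
    "open-llm-leaderboard/details_aisquared__chopt-2_7b",
    "harness_gsm8k_5",
    split="2023_10_13T00_33_45.271884",
)

print(results)
print(gsm8k_run)
```

Pinning the timestamped split is useful when the "latest" alias moves after a newer evaluation run is added.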
open-llm-leaderboard/details_aisquared__chopt-2_7b
[ "region:us" ]
2023-08-18T10:33:52+00:00
{"pretty_name": "Evaluation run of aisquared/chopt-2_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/chopt-2_7b](https://huggingface.co/aisquared/chopt-2_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__chopt-2_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T00:33:45.271884](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__chopt-2_7b/blob/main/results_2023-10-13T00-33-45.271884.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196461222,\n \"f1\": 0.04857906879194641,\n \"f1_stderr\": 0.0012385170365466402,\n \"acc\": 0.2888713496448303,\n \"acc_stderr\": 0.006940791015329276\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196461222,\n \"f1\": 0.04857906879194641,\n \"f1_stderr\": 0.0012385170365466402\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5777426992896606,\n \"acc_stderr\": 0.013881582030658552\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/chopt-2_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T00_33_45.271884", "path": ["**/details_harness|drop|3_2023-10-13T00-33-45.271884.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T00-33-45.271884.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T00_33_45.271884", "path": ["**/details_harness|gsm8k|5_2023-10-13T00-33-45.271884.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T00-33-45.271884.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:07:47.560826.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:07:47.560826.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:07:47.560826.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:07:47.560826.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:07:47.560826.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:07:47.560826.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T00_33_45.271884", "path": ["**/details_harness|winogrande|5_2023-10-13T00-33-45.271884.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T00-33-45.271884.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_07_47.560826", "path": ["results_2023-07-19T16:07:47.560826.parquet"]}, {"split": "2023_10_13T00_33_45.271884", "path": ["results_2023-10-13T00-33-45.271884.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T00-33-45.271884.parquet"]}]}]}
2023-10-12T23:33:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/chopt-2_7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/chopt-2_7b on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T00:33:45.271884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of aisquared/chopt-2_7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/chopt-2_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T00:33:45.271884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/chopt-2_7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/chopt-2_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T00:33:45.271884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/chopt-2_7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/chopt-2_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T00:33:45.271884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3d7298438f48f5408e94f69ddd315bc6c9f21f52
# Dataset Card for Evaluation run of aisquared/dlite-v1-124m

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/dlite-v1-124m
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-124m](https://huggingface.co/aisquared/dlite-v1-124m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v1-124m",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T05:52:16.762412](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-124m/blob/main/results_2023-10-17T05-52-16.762412.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.011954697986577181,
        "em_stderr": 0.0011130056898859015,
        "f1": 0.0519830117449665,
        "f1_stderr": 0.0015990891614949285,
        "acc": 0.2509865824782952,
        "acc_stderr": 0.0070261881296128145
    },
    "harness|drop|3": {
        "em": 0.011954697986577181,
        "em_stderr": 0.0011130056898859015,
        "f1": 0.0519830117449665,
        "f1_stderr": 0.0015990891614949285
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5019731649565904,
        "acc_stderr": 0.014052376259225629
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
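As with the other evaluation datasets above, the per-task configurations can also be discovered programmatically. The short sketch below is illustrative only; it assumes the config names listed in this record's metadata (for example "harness_drop_3" with a "latest" split) are still present in the repository.

```python
from datasets import get_dataset_config_names, load_dataset

# List the per-task configurations (the card states there are 64 of them).
configs = get_dataset_config_names("open-llm-leaderboard/details_aisquared__dlite-v1-124m")
print(len(configs), configs[:5])

# Load the most recent drop details for this model via the "latest" split alias.
drop_latest = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v1-124m",
    "harness_drop_3",
    split="latest",
)
print(drop_latest)
```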
open-llm-leaderboard/details_aisquared__dlite-v1-124m
[ "region:us" ]
2023-08-18T10:34:00+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v1-124m", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-124m](https://huggingface.co/aisquared/dlite-v1-124m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v1-124m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T05:52:16.762412](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-124m/blob/main/results_2023-10-17T05-52-16.762412.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.011954697986577181,\n \"em_stderr\": 0.0011130056898859015,\n \"f1\": 0.0519830117449665,\n \"f1_stderr\": 0.0015990891614949285,\n \"acc\": 0.2509865824782952,\n \"acc_stderr\": 0.0070261881296128145\n },\n \"harness|drop|3\": {\n \"em\": 0.011954697986577181,\n \"em_stderr\": 0.0011130056898859015,\n \"f1\": 0.0519830117449665,\n \"f1_stderr\": 0.0015990891614949285\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225629\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v1-124m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T05_52_16.762412", "path": ["**/details_harness|drop|3_2023-10-17T05-52-16.762412.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T05-52-16.762412.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T05_52_16.762412", "path": ["**/details_harness|gsm8k|5_2023-10-17T05-52-16.762412.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T05-52-16.762412.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:09.752185.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:09.752185.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:09.752185.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:09.752185.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:54:09.752185.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:54:09.752185.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T05_52_16.762412", "path": ["**/details_harness|winogrande|5_2023-10-17T05-52-16.762412.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T05-52-16.762412.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_54_09.752185", "path": ["results_2023-07-19T13:54:09.752185.parquet"]}, {"split": "2023_10_17T05_52_16.762412", "path": ["results_2023-10-17T05-52-16.762412.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T05-52-16.762412.parquet"]}]}]}
2023-10-17T04:52:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v1-124m ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/dlite-v1-124m on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T05:52:16.762412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of aisquared/dlite-v1-124m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-124m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T05:52:16.762412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v1-124m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-124m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T05:52:16.762412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v1-124m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-124m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T05:52:16.762412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2f0086905b00c7d29011519a5a3a47318eae05ae
# Dataset Card for Evaluation run of aisquared/chopt-1_3b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/chopt-1_3b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/chopt-1_3b](https://huggingface.co/aisquared/chopt-1_3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__chopt-1_3b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-25T02:11:14.117719](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__chopt-1_3b/blob/main/results_2023-10-25T02-11-14.117719.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054710093,
        "f1": 0.046667365771812144,
        "f1_stderr": 0.0012971244615236355,
        "acc": 0.2912391475927388,
        "acc_stderr": 0.006929989132220124
    },
    "harness|drop|3": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054710093,
        "f1": 0.046667365771812144,
        "f1_stderr": 0.0012971244615236355
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5824782951854776,
        "acc_stderr": 0.013859978264440248
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
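The per-task details behind the aggregates above can likewise be loaded from their own configurations. The sketch below is illustrative only: the config name `harness_gsm8k_5` and the `latest` split come from this repository's configuration metadata, while the per-example column names are not guaranteed here, so it inspects the columns before reading individual fields.

```python
from datasets import load_dataset

# Per-task details: each harness task has its own config, and "latest"
# points to the most recent run (here 2023-10-25T02:11:14.117719).
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_aisquared__chopt-1_3b",
    "harness_gsm8k_5",
    split="latest",
)

# Column names differ between tasks and harness versions, so check them
# before accessing specific per-example fields.
print(gsm8k_details.column_names)
print(gsm8k_details[0])
```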
open-llm-leaderboard/details_aisquared__chopt-1_3b
[ "region:us" ]
2023-08-18T10:34:09+00:00
{"pretty_name": "Evaluation run of aisquared/chopt-1_3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/chopt-1_3b](https://huggingface.co/aisquared/chopt-1_3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__chopt-1_3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T02:11:14.117719](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__chopt-1_3b/blob/main/results_2023-10-25T02-11-14.117719.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710093,\n \"f1\": 0.046667365771812144,\n \"f1_stderr\": 0.0012971244615236355,\n \"acc\": 0.2912391475927388,\n \"acc_stderr\": 0.006929989132220124\n },\n \"harness|drop|3\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710093,\n \"f1\": 0.046667365771812144,\n \"f1_stderr\": 0.0012971244615236355\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5824782951854776,\n \"acc_stderr\": 0.013859978264440248\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/chopt-1_3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T02_11_14.117719", "path": ["**/details_harness|drop|3_2023-10-25T02-11-14.117719.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T02-11-14.117719.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T02_11_14.117719", "path": ["**/details_harness|gsm8k|5_2023-10-25T02-11-14.117719.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T02-11-14.117719.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:06.685040.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:06.685040.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:06.685040.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:06.685040.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:44:06.685040.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:44:06.685040.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T02_11_14.117719", "path": ["**/details_harness|winogrande|5_2023-10-25T02-11-14.117719.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T02-11-14.117719.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_44_06.685040", "path": ["results_2023-07-19T14:44:06.685040.parquet"]}, {"split": "2023_10_25T02_11_14.117719", "path": ["results_2023-10-25T02-11-14.117719.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T02-11-14.117719.parquet"]}]}]}
2023-10-25T01:11:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/chopt-1_3b

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model aisquared/chopt-1_3b on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch after this card):

## Latest results

These are the latest results from run 2023-10-25T02:11:14.117719 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
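The loading example referenced in the card above was stripped from this record. A minimal sketch of what it would look like follows; the repository id is an assumption based on the standard `details_<org>__<model>` naming, while the `harness_winogrande_5` config and the `latest` split come from the metadata listed earlier for this run:

```python
from datasets import load_dataset

# Assumed repository id (not stated in this record); it follows the
# "open-llm-leaderboard/details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_aisquared__chopt-1_3b",
    "harness_winogrande_5",  # config present in this run's metadata
    split="latest",          # always resolves to the most recent evaluation
)
print(data)
```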
[ "# Dataset Card for Evaluation run of aisquared/chopt-1_3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/chopt-1_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T02:11:14.117719(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/chopt-1_3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/chopt-1_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T02:11:14.117719(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/chopt-1_3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/chopt-1_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T02:11:14.117719(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7759daedbc333ae755fd31a123cd4624cbafaa9d
# Dataset Card for Evaluation run of aisquared/dlite-v2-1_5b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/dlite-v2-1_5b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-1_5b](https://huggingface.co/aisquared/dlite-v2-1_5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v2-1_5b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T07:28:24.104795](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-1_5b/blob/main/results_2023-10-17T07-28-24.104795.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0019924496644295304,
        "em_stderr": 0.0004566676462667015,
        "f1": 0.0503942953020135,
        "f1_stderr": 0.0012335220693783073,
        "acc": 0.2746178881540092,
        "acc_stderr": 0.007651262223507757
    },
    "harness|drop|3": {
        "em": 0.0019924496644295304,
        "em_stderr": 0.0004566676462667015,
        "f1": 0.0503942953020135,
        "f1_stderr": 0.0012335220693783073
    },
    "harness|gsm8k|5": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.0013121578148674103
    },
    "harness|winogrande|5": {
        "acc": 0.5469613259668509,
        "acc_stderr": 0.013990366632148104
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_aisquared__dlite-v2-1_5b
[ "region:us" ]
2023-08-18T10:34:17+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v2-1_5b", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-1_5b](https://huggingface.co/aisquared/dlite-v2-1_5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v2-1_5b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T07:28:24.104795](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-1_5b/blob/main/results_2023-10-17T07-28-24.104795.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462667015,\n \"f1\": 0.0503942953020135,\n \"f1_stderr\": 0.0012335220693783073,\n \"acc\": 0.2746178881540092,\n \"acc_stderr\": 0.007651262223507757\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462667015,\n \"f1\": 0.0503942953020135,\n \"f1_stderr\": 0.0012335220693783073\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674103\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5469613259668509,\n \"acc_stderr\": 0.013990366632148104\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v2-1_5b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T07_28_24.104795", "path": ["**/details_harness|drop|3_2023-10-17T07-28-24.104795.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T07-28-24.104795.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T07_28_24.104795", "path": ["**/details_harness|gsm8k|5_2023-10-17T07-28-24.104795.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T07-28-24.104795.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:15:41.059925.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:15:41.059925.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:15:41.059925.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:15:41.059925.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:15:41.059925.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T07_28_24.104795", "path": ["**/details_harness|winogrande|5_2023-10-17T07-28-24.104795.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T07-28-24.104795.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_15_41.059925", "path": ["results_2023-07-18T11:15:41.059925.parquet"]}, {"split": "2023_10_17T07_28_24.104795", "path": ["results_2023-10-17T07-28-24.104795.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T07-28-24.104795.parquet"]}]}]}
2023-10-17T06:28:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v2-1_5b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/dlite-v2-1_5b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T07:28:24.104795 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
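The flattened copy above drops the loading snippet that "you can for instance do the following:" refers to. A minimal sketch of what it points at, assuming the standard `datasets` API and that the details repo follows the leaderboard's usual `details_<org>__<model>` naming (the exact repo id is not quoted in this record):

```python
from datasets import load_dataset

# Load the latest WinoGrande details for this run; the repo id below is
# inferred from the naming scheme used by the other records in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v2-1_5b",
    "harness_winogrande_5",
    split="train",
)
```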
[ "# Dataset Card for Evaluation run of aisquared/dlite-v2-1_5b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-1_5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T07:28:24.104795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v2-1_5b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-1_5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T07:28:24.104795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v2-1_5b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-1_5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T07:28:24.104795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
065c046eb90811e1625a2bc740061a7932efa22f
# Dataset Card for Evaluation run of aisquared/dlite-v1-355m ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/aisquared/dlite-v1-355m - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-355m](https://huggingface.co/aisquared/dlite-v1-355m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v1-355m", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T20:11:22.634896](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-355m/blob/main/results_2023-10-27T20-11-22.634896.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.009123322147651007, "em_stderr": 0.0009737017705541621, "f1": 0.05341862416107383, "f1_stderr": 0.0014844140427647057, "acc": 0.26400947119179163, "acc_stderr": 0.0070152021067028955 }, "harness|drop|3": { "em": 0.009123322147651007, "em_stderr": 0.0009737017705541621, "f1": 0.05341862416107383, "f1_stderr": 0.0014844140427647057 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.5280189423835833, "acc_stderr": 0.014030404213405791 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_aisquared__dlite-v1-355m
[ "region:us" ]
2023-08-18T10:34:26+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v1-355m", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-355m](https://huggingface.co/aisquared/dlite-v1-355m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v1-355m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T20:11:22.634896](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-355m/blob/main/results_2023-10-27T20-11-22.634896.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009123322147651007,\n \"em_stderr\": 0.0009737017705541621,\n \"f1\": 0.05341862416107383,\n \"f1_stderr\": 0.0014844140427647057,\n \"acc\": 0.26400947119179163,\n \"acc_stderr\": 0.0070152021067028955\n },\n \"harness|drop|3\": {\n \"em\": 0.009123322147651007,\n \"em_stderr\": 0.0009737017705541621,\n \"f1\": 0.05341862416107383,\n \"f1_stderr\": 0.0014844140427647057\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5280189423835833,\n \"acc_stderr\": 0.014030404213405791\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v1-355m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T20_11_22.634896", "path": ["**/details_harness|drop|3_2023-10-27T20-11-22.634896.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T20-11-22.634896.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T20_11_22.634896", "path": ["**/details_harness|gsm8k|5_2023-10-27T20-11-22.634896.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T20-11-22.634896.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:15:29.432225.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:15:29.432225.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:15:29.432225.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:15:29.432225.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:15:29.432225.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:15:29.432225.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T20_11_22.634896", "path": ["**/details_harness|winogrande|5_2023-10-27T20-11-22.634896.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T20-11-22.634896.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_15_29.432225", "path": ["results_2023-07-19T14:15:29.432225.parquet"]}, {"split": "2023_10_27T20_11_22.634896", "path": ["results_2023-10-27T20-11-22.634896.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T20-11-22.634896.parquet"]}]}]}
2023-10-27T19:11:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v1-355m ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/dlite-v1-355m on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-27T20:11:22.634896 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
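This flattened copy likewise drops the snippet that "you can for instance do the following:" refers to; a minimal example, using the repo id recorded for this run and the standard `datasets` API:

```python
from datasets import load_dataset

# Load the latest WinoGrande details split for this evaluation run,
# matching the snippet in the full card for
# open-llm-leaderboard/details_aisquared__dlite-v1-355m.
data = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v1-355m",
    "harness_winogrande_5",
    split="train",
)
```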
[ "# Dataset Card for Evaluation run of aisquared/dlite-v1-355m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-355m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T20:11:22.634896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v1-355m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-355m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T20:11:22.634896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v1-355m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-355m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T20:11:22.634896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4e65b725f722cfea3ad46b7403b812da6459218e
# Dataset Card for Evaluation run of aisquared/dlite-v2-124m

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/dlite-v2-124m
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-124m](https://huggingface.co/aisquared/dlite-v2-124m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v2-124m",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T09:27:20.533537](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-124m/blob/main/results_2023-10-27T09-27-20.533537.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0050335570469798654,
        "em_stderr": 0.0007247385547751906,
        "f1": 0.05289324664429539,
        "f1_stderr": 0.001460860471625635,
        "acc": 0.2521704814522494,
        "acc_stderr": 0.007025978032038446
    },
    "harness|drop|3": {
        "em": 0.0050335570469798654,
        "em_stderr": 0.0007247385547751906,
        "f1": 0.05289324664429539,
        "f1_stderr": 0.001460860471625635
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5043409629044988,
        "acc_stderr": 0.014051956064076892
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
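As an illustrative sketch (not part of the generated card), the aggregated "results" configuration described above can be loaded with the same `load_dataset` call pattern shown in the card. The configuration name `"results"` and the `"latest"` split are taken from this dataset's config listing below; treat the snippet as an example under those assumptions, not a definitive API.

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent run (see the config listing).
results = load_dataset(
    "open-llm-leaderboard/details_aisquared__dlite-v2-124m",
    "results",
    split="latest",
)
print(results)      # dataset schema and number of rows
print(results[0])   # first row: aggregated metrics of the latest run
```

Any of the per-task configurations listed in the metadata (for example `harness_gsm8k_5` or `harness_drop_3`) can be substituted for `"results"` in the same call.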
open-llm-leaderboard/details_aisquared__dlite-v2-124m
[ "region:us" ]
2023-08-18T10:34:35+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v2-124m", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-124m](https://huggingface.co/aisquared/dlite-v2-124m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v2-124m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T09:27:20.533537](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-124m/blob/main/results_2023-10-27T09-27-20.533537.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0050335570469798654,\n \"em_stderr\": 0.0007247385547751906,\n \"f1\": 0.05289324664429539,\n \"f1_stderr\": 0.001460860471625635,\n \"acc\": 0.2521704814522494,\n \"acc_stderr\": 0.007025978032038446\n },\n \"harness|drop|3\": {\n \"em\": 0.0050335570469798654,\n \"em_stderr\": 0.0007247385547751906,\n \"f1\": 0.05289324664429539,\n \"f1_stderr\": 0.001460860471625635\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.014051956064076892\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v2-124m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T09_27_20.533537", "path": ["**/details_harness|drop|3_2023-10-27T09-27-20.533537.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T09-27-20.533537.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T09_27_20.533537", "path": ["**/details_harness|gsm8k|5_2023-10-27T09-27-20.533537.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T09-27-20.533537.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:53:19.147655.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:53:19.147655.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:53:19.147655.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:53:19.147655.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:53:19.147655.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:53:19.147655.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T09_27_20.533537", "path": ["**/details_harness|winogrande|5_2023-10-27T09-27-20.533537.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T09-27-20.533537.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_53_19.147655", "path": ["results_2023-07-19T13:53:19.147655.parquet"]}, {"split": "2023_10_27T09_27_20.533537", "path": ["results_2023-10-27T09-27-20.533537.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T09-27-20.533537.parquet"]}]}]}
2023-10-27T08:27:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v2-124m

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model aisquared/dlite-v2-124m on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-27T09:27:20.533537 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of aisquared/dlite-v2-124m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-124m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T09:27:20.533537(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v2-124m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-124m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T09:27:20.533537(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v2-124m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v2-124m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T09:27:20.533537(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
dc00f82a6fa3cd9a968c33537727eb5727041434
# Dataset Card for Evaluation run of aisquared/dlite-v1-774m

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/aisquared/dlite-v1-774m
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-774m](https://huggingface.co/aisquared/dlite-v1-774m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v1-774m",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T09:49:41.867604](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-774m/blob/main/results_2023-10-18T09-49-41.867604.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.011220637583892617,
        "em_stderr": 0.0010786936337733937,
        "f1": 0.06615142617449662,
        "f1_stderr": 0.001688981547339462,
        "acc": 0.27308602999210735,
        "acc_stderr": 0.006996220781853532
    },
    "harness|drop|3": {
        "em": 0.011220637583892617,
        "em_stderr": 0.0010786936337733937,
        "f1": 0.06615142617449662,
        "f1_stderr": 0.001688981547339462
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5461720599842147,
        "acc_stderr": 0.013992441563707063
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
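A second hedged sketch, this time for the per-run splits described above: the `harness_gsm8k_5` configuration and the timestamped split name are copied from this dataset's config listing below, and the call pattern simply mirrors the `load_dataset` example in the card. Whether the dotted split name resolves on your `datasets` version is an assumption to verify.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_aisquared__dlite-v1-774m"

# "latest" always resolves to the most recent evaluation run.
gsm8k_latest = load_dataset(repo, "harness_gsm8k_5", split="latest")

# Each run is also reachable through its timestamped split name,
# exactly as declared in the dataset's config listing.
gsm8k_run = load_dataset(repo, "harness_gsm8k_5", split="2023_10_18T09_49_41.867604")

print(gsm8k_latest)
```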
open-llm-leaderboard/details_aisquared__dlite-v1-774m
[ "region:us" ]
2023-08-18T10:34:43+00:00
{"pretty_name": "Evaluation run of aisquared/dlite-v1-774m", "dataset_summary": "Dataset automatically created during the evaluation run of model [aisquared/dlite-v1-774m](https://huggingface.co/aisquared/dlite-v1-774m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v1-774m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T09:49:41.867604](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-774m/blob/main/results_2023-10-18T09-49-41.867604.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.011220637583892617,\n \"em_stderr\": 0.0010786936337733937,\n \"f1\": 0.06615142617449662,\n \"f1_stderr\": 0.001688981547339462,\n \"acc\": 0.27308602999210735,\n \"acc_stderr\": 0.006996220781853532\n },\n \"harness|drop|3\": {\n \"em\": 0.011220637583892617,\n \"em_stderr\": 0.0010786936337733937,\n \"f1\": 0.06615142617449662,\n \"f1_stderr\": 0.001688981547339462\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5461720599842147,\n \"acc_stderr\": 0.013992441563707063\n }\n}\n```", "repo_url": "https://huggingface.co/aisquared/dlite-v1-774m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T09_49_41.867604", "path": ["**/details_harness|drop|3_2023-10-18T09-49-41.867604.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T09-49-41.867604.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T09_49_41.867604", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-49-41.867604.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-49-41.867604.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:45.959233.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:45.959233.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:45.959233.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:45.959233.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:26:45.959233.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:26:45.959233.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T09_49_41.867604", "path": ["**/details_harness|winogrande|5_2023-10-18T09-49-41.867604.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T09-49-41.867604.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_26_45.959233", "path": ["results_2023-07-19T14:26:45.959233.parquet"]}, {"split": "2023_10_18T09_49_41.867604", "path": ["results_2023-10-18T09-49-41.867604.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T09-49-41.867604.parquet"]}]}]}
2023-10-18T08:49:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aisquared/dlite-v1-774m ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model aisquared/dlite-v1-774m on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T09:49:41.867604 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
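The flattened summary above drops the code block from the original card; for reference, a minimal sketch of the load call it refers to (the repository id and configuration name are the ones given in this card's metadata):

```python
from datasets import load_dataset

# Each evaluated task has its own configuration; the "train" split always
# points to the latest evaluation run for this model.
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v1-774m",
	"harness_winogrande_5",
	split="train")
```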
[ "# Dataset Card for Evaluation run of aisquared/dlite-v1-774m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-774m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T09:49:41.867604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aisquared/dlite-v1-774m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-774m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T09:49:41.867604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aisquared/dlite-v1-774m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model aisquared/dlite-v1-774m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T09:49:41.867604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b1489c79b3b30f1637cf026fa9b3b5a848653f55
# Dataset Card for Evaluation run of BreadAi/DiscordPy

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/BreadAi/DiscordPy
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/DiscordPy](https://huggingface.co/BreadAi/DiscordPy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__DiscordPy",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T03:17:44.318630](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__DiscordPy/blob/main/results_2023-09-17T03-17-44.318630.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.01143036912751678,
        "em_stderr": 0.0010886127371891195,
        "f1": 0.02683305369127515,
        "f1_stderr": 0.0013566644608392164,
        "acc": 0.2549329123914759,
        "acc_stderr": 0.007024874916683796
    },
    "harness|drop|3": {
        "em": 0.01143036912751678,
        "em_stderr": 0.0010886127371891195,
        "f1": 0.02683305369127515,
        "f1_stderr": 0.0013566644608392164
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5098658247829518,
        "acc_stderr": 0.014049749833367592
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
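Beyond the per-task snippet above, a minimal sketch of pulling the aggregated numbers instead of per-example details; this assumes only the public `datasets` API, with the `"results"` configuration and `"latest"` split names taken from this card's own configuration list:

```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus an aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_BreadAi__DiscordPy")
print(configs)

# The "latest" split of the "results" configuration points to the most recent
# evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_BreadAi__DiscordPy",
	"results",
	split="latest")
print(results[0])
```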
open-llm-leaderboard/details_BreadAi__DiscordPy
[ "region:us" ]
2023-08-18T10:34:52+00:00
{"pretty_name": "Evaluation run of BreadAi/DiscordPy", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/DiscordPy](https://huggingface.co/BreadAi/DiscordPy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__DiscordPy\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T03:17:44.318630](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__DiscordPy/blob/main/results_2023-09-17T03-17-44.318630.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01143036912751678,\n \"em_stderr\": 0.0010886127371891195,\n \"f1\": 0.02683305369127515,\n \"f1_stderr\": 0.0013566644608392164,\n \"acc\": 0.2549329123914759,\n \"acc_stderr\": 0.007024874916683796\n },\n \"harness|drop|3\": {\n \"em\": 0.01143036912751678,\n \"em_stderr\": 0.0010886127371891195,\n \"f1\": 0.02683305369127515,\n \"f1_stderr\": 0.0013566644608392164\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5098658247829518,\n \"acc_stderr\": 0.014049749833367592\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/DiscordPy", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T03_17_44.318630", "path": ["**/details_harness|drop|3_2023-09-17T03-17-44.318630.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T03-17-44.318630.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T03_17_44.318630", "path": ["**/details_harness|gsm8k|5_2023-09-17T03-17-44.318630.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T03-17-44.318630.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:34.625744.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:34.625744.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:34.625744.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:34.625744.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:21:34.625744.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:21:34.625744.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T03_17_44.318630", "path": ["**/details_harness|winogrande|5_2023-09-17T03-17-44.318630.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T03-17-44.318630.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_21_34.625744", "path": ["results_2023-07-19T19:21:34.625744.parquet"]}, {"split": "2023_09_17T03_17_44.318630", "path": ["results_2023-09-17T03-17-44.318630.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T03-17-44.318630.parquet"]}]}]}
2023-09-17T02:17:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/DiscordPy ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/DiscordPy on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T03:17:44.318630 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
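The summary in this entry mentions a loading snippet that is not reproduced in the flattened text above. A minimal sketch of such a snippet is given below; note that the repository id is an assumption, inferred from the `details_<org>__<model>` naming convention used by the other leaderboard details datasets in this document, and `harness_winogrande_5` is just one of the 64 declared configurations.

```python
from datasets import load_dataset

# Assumed repository id, inferred from the "details_<org>__<model>"
# naming convention; it is not stated explicitly in this entry.
data = load_dataset(
    "open-llm-leaderboard/details_BreadAi__DiscordPy",
    "harness_winogrande_5",  # any of the declared configurations works here
    split="train",           # "train" always points at the latest results
)
print(data)
```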
[ "# Dataset Card for Evaluation run of BreadAi/DiscordPy", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/DiscordPy on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T03:17:44.318630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/DiscordPy", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/DiscordPy on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T03:17:44.318630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/DiscordPy## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/DiscordPy on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T03:17:44.318630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7ac9c900f87461a7e374d018177ed17dd1e274a4
# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_160M

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/BreadAi/gpt-YA-1-1_160M
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/gpt-YA-1-1_160M](https://huggingface.co/BreadAi/gpt-YA-1-1_160M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_160M",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-16T23:15:39.203631](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_160M/blob/main/results_2023-09-16T23-15-39.203631.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0037751677852348995,
        "em_stderr": 0.0006280387809484644,
        "f1": 0.02323720637583893,
        "f1_stderr": 0.0010591010187142242,
        "acc": 0.2533543804262036,
        "acc_stderr": 0.007025610346165173
    },
    "harness|drop|3": {
        "em": 0.0037751677852348995,
        "em_stderr": 0.0006280387809484644,
        "f1": 0.02323720637583893,
        "f1_stderr": 0.0010591010187142242
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5067087608524072,
        "acc_stderr": 0.014051220692330346
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
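As a complement to the snippet in the card above, the sketch below shows how the other split names declared in this repository's metadata can be used: the "latest" split of a per-task configuration and the aggregated "results" configuration. This is only an illustrative sketch; the column layout of the underlying parquet files is not documented in the card.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_160M"

# Per-task details: "latest" mirrors the most recent run, while a
# timestamped split such as "2023_09_16T23_15_39.203631" pins one run.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(gsm8k_details)

# Aggregated metrics for every run live in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")
print(results.column_names)  # inspect which metric columns are stored
```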
open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_160M
[ "region:us" ]
2023-08-18T10:35:01+00:00
{"pretty_name": "Evaluation run of BreadAi/gpt-YA-1-1_160M", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/gpt-YA-1-1_160M](https://huggingface.co/BreadAi/gpt-YA-1-1_160M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_160M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T23:15:39.203631](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_160M/blob/main/results_2023-09-16T23-15-39.203631.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0037751677852348995,\n \"em_stderr\": 0.0006280387809484644,\n \"f1\": 0.02323720637583893,\n \"f1_stderr\": 0.0010591010187142242,\n \"acc\": 0.2533543804262036,\n \"acc_stderr\": 0.007025610346165173\n },\n \"harness|drop|3\": {\n \"em\": 0.0037751677852348995,\n \"em_stderr\": 0.0006280387809484644,\n \"f1\": 0.02323720637583893,\n \"f1_stderr\": 0.0010591010187142242\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5067087608524072,\n \"acc_stderr\": 0.014051220692330346\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/gpt-YA-1-1_160M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T23_15_39.203631", "path": ["**/details_harness|drop|3_2023-09-16T23-15-39.203631.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T23-15-39.203631.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T23_15_39.203631", "path": ["**/details_harness|gsm8k|5_2023-09-16T23-15-39.203631.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T23-15-39.203631.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:02:10.207194.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:02:10.207194.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:02:10.207194.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:02:10.207194.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:02:10.207194.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:02:10.207194.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T23_15_39.203631", "path": ["**/details_harness|winogrande|5_2023-09-16T23-15-39.203631.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T23-15-39.203631.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_02_10.207194", "path": ["results_2023-07-19T14:02:10.207194.parquet"]}, {"split": "2023_09_16T23_15_39.203631", "path": ["results_2023-09-16T23-15-39.203631.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T23-15-39.203631.parquet"]}]}]}
2023-09-16T22:15:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_160M ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_160M on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T23:15:39.203631 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
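This flattened copy of the card ends the loading instructions at "do the following:" without the original snippet. A minimal sketch of that call is given below; the repository id is an assumption inferred from the "open-llm-leaderboard/details_<org>__<model>" naming pattern used by the other detail datasets in this document, and "harness_winogrande_5" is one of the configurations listed in this record's metadata.

```python
from datasets import load_dataset

# Repository id assumed from the usual leaderboard naming pattern;
# "harness_winogrande_5" and "results" are configurations listed in this record's metadata.
data = load_dataset("open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_160M",
                    "harness_winogrande_5",
                    split="latest")
```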
[ "# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_160M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_160M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T23:15:39.203631(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_160M", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_160M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T23:15:39.203631(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_160M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-YA-1-1_160M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T23:15:39.203631(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
74a4e5dd4407dfa8a82a56a5e6314c508c4718f2
# Dataset Card for Evaluation run of BreadAi/MusePy-1-2

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/BreadAi/MusePy-1-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/MusePy-1-2](https://huggingface.co/BreadAi/MusePy-1-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__MusePy-1-2",
	"harness_gsm8k_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-03T17:04:48.338074](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__MusePy-1-2/blob/main/results_2023-12-03T17-04-48.338074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
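Beyond the per-task details loaded above, the aggregated numbers quoted under "Latest results" come from the `results` configuration. The snippet below is a minimal sketch of that pattern; the configuration and split names are taken from this record's metadata.

```python
from datasets import load_dataset

# Aggregated metrics for every evaluated task live in the "results" configuration;
# the "latest" split always points at the most recent run (2023-12-03 here).
results = load_dataset("open-llm-leaderboard/details_BreadAi__MusePy-1-2",
                       "results",
                       split="latest")

# Earlier runs remain available under their timestamped splits,
# e.g. split="2023_10_25T07_17_07.226410" for the 2023-10-25 evaluation.
print(results[0])
```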
open-llm-leaderboard/details_BreadAi__MusePy-1-2
[ "region:us" ]
2023-08-18T10:35:09+00:00
{"pretty_name": "Evaluation run of BreadAi/MusePy-1-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/MusePy-1-2](https://huggingface.co/BreadAi/MusePy-1-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__MusePy-1-2\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T17:04:48.338074](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__MusePy-1-2/blob/main/results_2023-12-03T17-04-48.338074.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/MusePy-1-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T07_17_07.226410", "path": ["**/details_harness|drop|3_2023-10-25T07-17-07.226410.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T07-17-07.226410.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T07_17_07.226410", "path": ["**/details_harness|gsm8k|5_2023-10-25T07-17-07.226410.parquet"]}, {"split": "2023_12_03T17_04_48.338074", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-04-48.338074.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-04-48.338074.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:08.820966.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:39:08.820966.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:39:08.820966.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T07_17_07.226410", "path": ["**/details_harness|winogrande|5_2023-10-25T07-17-07.226410.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T07-17-07.226410.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_39_08.820966", "path": ["results_2023-07-19T19:39:08.820966.parquet"]}, {"split": "2023_10_25T07_17_07.226410", "path": ["results_2023-10-25T07-17-07.226410.parquet"]}, {"split": "2023_12_03T17_04_48.338074", "path": ["results_2023-12-03T17-04-48.338074.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T17-04-48.338074.parquet"]}]}]}
2023-12-03T17:04:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/MusePy-1-2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/MusePy-1-2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-03T17:04:48.338074 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of BreadAi/MusePy-1-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/MusePy-1-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T17:04:48.338074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/MusePy-1-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/MusePy-1-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T17:04:48.338074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/MusePy-1-2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/MusePy-1-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T17:04:48.338074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1d728971b1887677866432506d7820a8e69d7905
# Dataset Card for Evaluation run of BreadAi/PM_modelV2

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/BreadAi/PM_modelV2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/PM_modelV2](https://huggingface.co/BreadAi/PM_modelV2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__PM_modelV2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-16T22:18:34.076880](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__PM_modelV2/blob/main/results_2023-09-16T22-18-34.076880.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0009437919463087249,
        "em_stderr": 0.00031446531194131066,
        "f1": 0.004450503355704698,
        "f1_stderr": 0.00041742876892271523,
        "acc": 0.2478295185477506,
        "acc_stderr": 0.007025978032038445
    },
    "harness|drop|3": {
        "em": 0.0009437919463087249,
        "em_stderr": 0.00031446531194131066,
        "f1": 0.004450503355704698,
        "f1_stderr": 0.00041742876892271523
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.4956590370955012,
        "acc_stderr": 0.01405195606407689
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
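As a complement to the single-task example above, the sketch below lists the available configurations and loads the aggregated `results` configuration referenced in the summary. It assumes the standard `datasets` API; the configuration names are those listed in this record's metadata.

```python
from datasets import get_dataset_config_names, load_dataset

# Each evaluated task has its own configuration; listing them shows which
# per-task detail files exist for this model (e.g. "harness_drop_3",
# "harness_gsm8k_5", "harness_winogrande_5", ..., "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_BreadAi__PM_modelV2")
print(configs)

# The aggregated numbers quoted under "Latest results" come from the
# "results" configuration, "latest" split.
results = load_dataset("open-llm-leaderboard/details_BreadAi__PM_modelV2",
                       "results",
                       split="latest")
```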
open-llm-leaderboard/details_BreadAi__PM_modelV2
[ "region:us" ]
2023-08-18T10:35:17+00:00
{"pretty_name": "Evaluation run of BreadAi/PM_modelV2", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/PM_modelV2](https://huggingface.co/BreadAi/PM_modelV2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__PM_modelV2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T22:18:34.076880](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__PM_modelV2/blob/main/results_2023-09-16T22-18-34.076880.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194131066,\n \"f1\": 0.004450503355704698,\n \"f1_stderr\": 0.00041742876892271523,\n \"acc\": 0.2478295185477506,\n \"acc_stderr\": 0.007025978032038445\n },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194131066,\n \"f1\": 0.004450503355704698,\n \"f1_stderr\": 0.00041742876892271523\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.01405195606407689\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/PM_modelV2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T22_18_34.076880", "path": ["**/details_harness|drop|3_2023-09-16T22-18-34.076880.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T22-18-34.076880.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T22_18_34.076880", "path": ["**/details_harness|gsm8k|5_2023-09-16T22-18-34.076880.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T22-18-34.076880.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:43.765981.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:43.765981.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:43.765981.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:16:43.765981.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:16:43.765981.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:16:43.765981.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T22_18_34.076880", "path": ["**/details_harness|winogrande|5_2023-09-16T22-18-34.076880.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T22-18-34.076880.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_16_43.765981", "path": ["results_2023-07-19T19:16:43.765981.parquet"]}, {"split": "2023_09_16T22_18_34.076880", "path": ["results_2023-09-16T22-18-34.076880.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T22-18-34.076880.parquet"]}]}]}
2023-09-16T21:18:45+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/PM_modelV2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/PM_modelV2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T22:18:34.076880 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
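The flattened summary above says "you can for instance do the following" but the accompanying snippet was stripped when the card text was flattened; the call below simply restores the example preserved in this row's metadata (the `harness_winogrande_5` config), so nothing here goes beyond what that metadata already states.

```python
from datasets import load_dataset

# Details for one task of the BreadAi/PM_modelV2 evaluation run;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_BreadAi__PM_modelV2",
	"harness_winogrande_5",
	split="train")
```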
[ "# Dataset Card for Evaluation run of BreadAi/PM_modelV2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/PM_modelV2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T22:18:34.076880(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/PM_modelV2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/PM_modelV2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T22:18:34.076880(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/PM_modelV2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/PM_modelV2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T22:18:34.076880(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ebbf7c2947e223f239b5f93772bbfa84f7d01efa
# Dataset Card for Evaluation run of BreadAi/MuseCan

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/BreadAi/MuseCan
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/MuseCan](https://huggingface.co/BreadAi/MuseCan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__MuseCan",
	"harness_gsm8k_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-03T17:02:11.200509](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__MuseCan/blob/main/results_2023-12-03T17-02-11.200509.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
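Since each run is stored as a timestamped split, a specific run can also be loaded directly instead of the "latest" alias. A minimal sketch, assuming the `harness_gsm8k_5` config and the `2023_12_03T17_02_11.200509` split name listed in this entry's metadata below:

```python
from datasets import load_dataset

# Load the 2023-12-03 run explicitly rather than the "latest" alias;
# split names follow the timestamped form used in the config metadata.
data = load_dataset("open-llm-leaderboard/details_BreadAi__MuseCan",
	"harness_gsm8k_5",
	split="2023_12_03T17_02_11.200509")
```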
open-llm-leaderboard/details_BreadAi__MuseCan
[ "region:us" ]
2023-08-18T10:35:26+00:00
{"pretty_name": "Evaluation run of BreadAi/MuseCan", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/MuseCan](https://huggingface.co/BreadAi/MuseCan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__MuseCan\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T17:02:11.200509](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__MuseCan/blob/main/results_2023-12-03T17-02-11.200509.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/MuseCan", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T02_44_47.372597", "path": ["**/details_harness|drop|3_2023-10-13T02-44-47.372597.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T02-44-47.372597.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T02_44_47.372597", "path": ["**/details_harness|gsm8k|5_2023-10-13T02-44-47.372597.parquet"]}, {"split": "2023_12_03T17_02_11.200509", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-02-11.200509.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-02-11.200509.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:11.706174.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:11.706174.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:11.706174.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:11.706174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:11.706174.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:11.706174.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:11.706174.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T02_44_47.372597", "path": ["**/details_harness|winogrande|5_2023-10-13T02-44-47.372597.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T02-44-47.372597.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_29_11.706174", "path": ["results_2023-07-19T19:29:11.706174.parquet"]}, {"split": "2023_10_13T02_44_47.372597", "path": ["results_2023-10-13T02-44-47.372597.parquet"]}, {"split": "2023_12_03T17_02_11.200509", "path": ["results_2023-12-03T17-02-11.200509.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T17-02-11.200509.parquet"]}]}]}
2023-12-03T17:02:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/MuseCan ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/MuseCan on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-03T17:02:11.200509 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
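The flattened card text above drops the load example that follows "you can for instance do the following". A minimal sketch of that call, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming (i.e. `open-llm-leaderboard/details_BreadAi__MuseCan`, which this record does not spell out):

```python
from datasets import load_dataset

# Repository id is assumed from the leaderboard's naming convention;
# it is not stated explicitly in this record.
data = load_dataset(
    "open-llm-leaderboard/details_BreadAi__MuseCan",
    "harness_winogrande_5",  # one per-task configuration, as listed in the metadata above
    split="latest",          # "latest" points at the most recent run's details
)
```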
[ "# Dataset Card for Evaluation run of BreadAi/MuseCan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/MuseCan on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T17:02:11.200509(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/MuseCan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/MuseCan on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T17:02:11.200509(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 16, 31, 165, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/MuseCan## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/MuseCan on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T17:02:11.200509(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
35eb94629b5ba76df8850f73cf767c2f08d89725
# Dataset Card for Evaluation run of BreadAi/gpt-Youtube

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/BreadAi/gpt-Youtube
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/gpt-Youtube](https://huggingface.co/BreadAi/gpt-Youtube) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__gpt-Youtube",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-16T17:05:46.246931](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-Youtube/blob/main/results_2023-09-16T17-05-46.246931.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.030411073825503357,
        "em_stderr": 0.0017585282619461873,
        "f1": 0.03320889261744967,
        "f1_stderr": 0.0017808043698679197,
        "acc": 0.244672454617206,
        "acc_stderr": 0.0070246472681452015
    },
    "harness|drop|3": {
        "em": 0.030411073825503357,
        "em_stderr": 0.0017585282619461873,
        "f1": 0.03320889261744967,
        "f1_stderr": 0.0017808043698679197
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.489344909234412,
        "acc_stderr": 0.014049294536290403
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_BreadAi__gpt-Youtube
[ "region:us" ]
2023-08-18T10:35:35+00:00
{"pretty_name": "Evaluation run of BreadAi/gpt-Youtube", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/gpt-Youtube](https://huggingface.co/BreadAi/gpt-Youtube) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__gpt-Youtube\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T17:05:46.246931](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-Youtube/blob/main/results_2023-09-16T17-05-46.246931.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.030411073825503357,\n \"em_stderr\": 0.0017585282619461873,\n \"f1\": 0.03320889261744967,\n \"f1_stderr\": 0.0017808043698679197,\n \"acc\": 0.244672454617206,\n \"acc_stderr\": 0.0070246472681452015\n },\n \"harness|drop|3\": {\n \"em\": 0.030411073825503357,\n \"em_stderr\": 0.0017585282619461873,\n \"f1\": 0.03320889261744967,\n \"f1_stderr\": 0.0017808043698679197\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.489344909234412,\n \"acc_stderr\": 0.014049294536290403\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/gpt-Youtube", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T17_05_46.246931", "path": ["**/details_harness|drop|3_2023-09-16T17-05-46.246931.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T17-05-46.246931.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T17_05_46.246931", "path": ["**/details_harness|gsm8k|5_2023-09-16T17-05-46.246931.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T17-05-46.246931.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:36:47.634434.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:36:47.634434.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:36:47.634434.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:36:47.634434.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:36:47.634434.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:36:47.634434.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T17_05_46.246931", "path": ["**/details_harness|winogrande|5_2023-09-16T17-05-46.246931.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T17-05-46.246931.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_36_47.634434", "path": ["results_2023-07-19T19:36:47.634434.parquet"]}, {"split": "2023_09_16T17_05_46.246931", "path": ["results_2023-09-16T17-05-46.246931.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T17-05-46.246931.parquet"]}]}]}
2023-09-16T16:05:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/gpt-Youtube ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/gpt-Youtube on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T17:05:46.246931 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
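The summary above describes how to load per-task details, but this stripped-down text omits the actual loader call. A minimal sketch follows; the repository id "open-llm-leaderboard/details_BreadAi__gpt-Youtube" is an assumption based on the details_<org>__<model> naming used by the other records in this dump, while the config name "harness_winogrande_5" and the split name "latest" appear verbatim in this record's metadata.

```python
# Minimal sketch (repo id assumed from the leaderboard's naming convention):
# load the latest Winogrande details for BreadAi/gpt-Youtube.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_BreadAi__gpt-Youtube",  # assumed repo id
    "harness_winogrande_5",  # config listed in this record's metadata
    split="latest",          # alias for the most recent timestamped run
)

print(data)
```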
[ "# Dataset Card for Evaluation run of BreadAi/gpt-Youtube", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-Youtube on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T17:05:46.246931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/gpt-Youtube", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-Youtube on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T17:05:46.246931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/gpt-Youtube## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/gpt-Youtube on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T17:05:46.246931(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
87e4dca5469d327d8325945515dd54cd5496f33e
# Dataset Card for Evaluation run of BreadAi/StoryPy

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/BreadAi/StoryPy
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [BreadAi/StoryPy](https://huggingface.co/BreadAi/StoryPy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__StoryPy",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T10:16:36.157284](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__StoryPy/blob/main/results_2023-09-23T10-16-36.157284.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.00027736144573356746,
        "f1": 0.011790058724832235,
        "f1_stderr": 0.0007354126826155291,
        "acc": 0.255327545382794,
        "acc_stderr": 0.007024647268145198
    },
    "harness|drop|3": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.00027736144573356746,
        "f1": 0.011790058724832235,
        "f1_stderr": 0.0007354126826155291
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.510655090765588,
        "acc_stderr": 0.014049294536290396
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
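The snippet in the card above loads the per-example details for a single task. As a complementary, hedged illustration (not part of the original card), the sketch below loads the aggregated "results" configuration of the same repository at its "latest" split; both names appear in this record's config list, but the exact column layout of the aggregated parquet file is not described here and may differ between runs.

```python
# Minimal sketch: inspect the aggregated "results" config for BreadAi/StoryPy.
# Config name "results" and split name "latest" are taken from this record's
# metadata; the schema is whatever the aggregated parquet file contains.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_BreadAi__StoryPy",
    "results",
    split="latest",
)

print(results.column_names)  # aggregated-results schema
print(results[0])            # first aggregated row for the latest run
```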
open-llm-leaderboard/details_BreadAi__StoryPy
[ "region:us" ]
2023-08-18T10:35:44+00:00
{"pretty_name": "Evaluation run of BreadAi/StoryPy", "dataset_summary": "Dataset automatically created during the evaluation run of model [BreadAi/StoryPy](https://huggingface.co/BreadAi/StoryPy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__StoryPy\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T10:16:36.157284](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__StoryPy/blob/main/results_2023-09-23T10-16-36.157284.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573356746,\n \"f1\": 0.011790058724832235,\n \"f1_stderr\": 0.0007354126826155291,\n \"acc\": 0.255327545382794,\n \"acc_stderr\": 0.007024647268145198\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573356746,\n \"f1\": 0.011790058724832235,\n \"f1_stderr\": 0.0007354126826155291\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.014049294536290396\n }\n}\n```", "repo_url": "https://huggingface.co/BreadAi/StoryPy", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T10_16_36.157284", "path": ["**/details_harness|drop|3_2023-09-23T10-16-36.157284.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T10-16-36.157284.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T10_16_36.157284", "path": ["**/details_harness|gsm8k|5_2023-09-23T10-16-36.157284.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T10-16-36.157284.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:25:02.732559.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:25:02.732559.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:25:02.732559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:25:02.732559.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:25:02.732559.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:25:02.732559.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T10_16_36.157284", "path": ["**/details_harness|winogrande|5_2023-09-23T10-16-36.157284.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T10-16-36.157284.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T10_25_02.732559", "path": ["results_2023-07-19T10:25:02.732559.parquet"]}, {"split": "2023_09_23T10_16_36.157284", "path": ["results_2023-09-23T10-16-36.157284.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T10-16-36.157284.parquet"]}]}]}
2023-09-23T09:16:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BreadAi/StoryPy ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model BreadAi/StoryPy on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T10:16:36.157284 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of BreadAi/StoryPy", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/StoryPy on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T10:16:36.157284(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BreadAi/StoryPy", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/StoryPy on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T10:16:36.157284(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of BreadAi/StoryPy## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model BreadAi/StoryPy on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T10:16:36.157284(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9b515671d651ac12e37853082d2da9884d234624
# Dataset Card for Evaluation run of duliadotio/dulia-13b-8k-alpha

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/duliadotio/dulia-13b-8k-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [duliadotio/dulia-13b-8k-alpha](https://huggingface.co/duliadotio/dulia-13b-8k-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_duliadotio__dulia-13b-8k-alpha",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T01:08:45.800403](https://huggingface.co/datasets/open-llm-leaderboard/details_duliadotio__dulia-13b-8k-alpha/blob/main/results_2023-09-18T01-08-45.800403.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.07822986577181208,
        "em_stderr": 0.002750028914814403,
        "f1": 0.17715289429530226,
        "f1_stderr": 0.0031326349334958943,
        "acc": 0.439400648526514,
        "acc_stderr": 0.010151999191592032
    },
    "harness|drop|3": {
        "em": 0.07822986577181208,
        "em_stderr": 0.002750028914814403,
        "f1": 0.17715289429530226,
        "f1_stderr": 0.0031326349334958943
    },
    "harness|gsm8k|5": {
        "acc": 0.1068991660348749,
        "acc_stderr": 0.008510982565520474
    },
    "harness|winogrande|5": {
        "acc": 0.7719021310181531,
        "acc_stderr": 0.011793015817663592
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
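Beyond the single snippet in the card, this record's config list pairs every configuration with both a timestamped split and a "latest" alias. A minimal sketch, using only the "harness_gsm8k_5" config and the two split names that appear verbatim in this record's metadata, shows how the same run can be loaded under either name:

```python
# Minimal sketch: "latest" and the timestamped split of harness_gsm8k_5 point
# at the same parquet file for this record, so both loads should agree.
from datasets import load_dataset

repo = "open-llm-leaderboard/details_duliadotio__dulia-13b-8k-alpha"

gsm8k_latest = load_dataset(repo, "harness_gsm8k_5", split="latest")
gsm8k_run = load_dataset(repo, "harness_gsm8k_5",
                         split="2023_09_18T01_08_45.800403")

# Same underlying file, so the number of evaluated examples should match.
print(len(gsm8k_latest), len(gsm8k_run))
```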
open-llm-leaderboard/details_duliadotio__dulia-13b-8k-alpha
[ "region:us" ]
2023-08-18T10:35:52+00:00
{"pretty_name": "Evaluation run of duliadotio/dulia-13b-8k-alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [duliadotio/dulia-13b-8k-alpha](https://huggingface.co/duliadotio/dulia-13b-8k-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_duliadotio__dulia-13b-8k-alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T01:08:45.800403](https://huggingface.co/datasets/open-llm-leaderboard/details_duliadotio__dulia-13b-8k-alpha/blob/main/results_2023-09-18T01-08-45.800403.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07822986577181208,\n \"em_stderr\": 0.002750028914814403,\n \"f1\": 0.17715289429530226,\n \"f1_stderr\": 0.0031326349334958943,\n \"acc\": 0.439400648526514,\n \"acc_stderr\": 0.010151999191592032\n },\n \"harness|drop|3\": {\n \"em\": 0.07822986577181208,\n \"em_stderr\": 0.002750028914814403,\n \"f1\": 0.17715289429530226,\n \"f1_stderr\": 0.0031326349334958943\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \"acc_stderr\": 0.008510982565520474\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n }\n}\n```", "repo_url": "https://huggingface.co/duliadotio/dulia-13b-8k-alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T01_08_45.800403", "path": ["**/details_harness|drop|3_2023-09-18T01-08-45.800403.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T01-08-45.800403.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T01_08_45.800403", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-08-45.800403.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-08-45.800403.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:33:18.083573.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:33:18.083573.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:33:18.083573.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:33:18.083573.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:33:18.083573.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T01_08_45.800403", "path": ["**/details_harness|winogrande|5_2023-09-18T01-08-45.800403.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T01-08-45.800403.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T21_33_18.083573", "path": ["results_2023-08-09T21:33:18.083573.parquet"]}, {"split": "2023_09_18T01_08_45.800403", "path": ["results_2023-09-18T01-08-45.800403.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T01-08-45.800403.parquet"]}]}]}
2023-09-18T00:08:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of duliadotio/dulia-13b-8k-alpha ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model duliadotio/dulia-13b-8k-alpha on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-18T01:08:45.800403(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of duliadotio/dulia-13b-8k-alpha", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model duliadotio/dulia-13b-8k-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T01:08:45.800403(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of duliadotio/dulia-13b-8k-alpha", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model duliadotio/dulia-13b-8k-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T01:08:45.800403(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of duliadotio/dulia-13b-8k-alpha## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model duliadotio/dulia-13b-8k-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T01:08:45.800403(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e4d4a916f66956259e5bfdddbd6e8167afb20412
# Dataset of sekibanki/赤蛮奇 (Touhou)

This is the dataset of sekibanki/赤蛮奇 (Touhou), containing 500 images and their tags.

The core tags of this character are `bow, red_hair, hair_bow, short_hair, red_eyes, blue_bow, bangs, breasts`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             |   Images | Size       | Download                                                                                                           | Type       | Description                                                           |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw              |      500 | 519.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sekibanki_touhou/resolve/main/dataset-raw.zip)               | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger).  |
| 800              |      500 | 317.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sekibanki_touhou/resolve/main/dataset-800.zip)               | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  |     1133 | 653.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sekibanki_touhou/resolve/main/dataset-stage3-p480-800.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             |      500 | 466.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sekibanki_touhou/resolve/main/dataset-1200.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 |     1133 | 892.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sekibanki_touhou/resolve/main/dataset-stage3-p480-1200.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |

(A minimal download sketch for one of the IMG+TXT packages is shown after the cluster tables below.)

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sekibanki_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_shirt, long_sleeves, miniskirt, pleated_skirt, red_skirt, solo, white_background, simple_background, hair_between_eyes, looking_at_viewer, red_cape, cowboy_shot, red_cloak, blush, covered_mouth | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, cape, long_sleeves, looking_at_viewer, shirt, skirt, solo | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cape, covered_mouth, solo, looking_at_viewer, blush, simple_background | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, disembodied_head, solo, white_background, simple_background, cape, >_<, closed_eyes, open_mouth | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, black_bodysuit, black_gloves, long_sleeves, looking_at_viewer, solo, hair_between_eyes, medium_breasts, simple_background, yellow_background, character_name, covered_navel, high_collar, jacket, open_mouth, standing, zipper_pull_tab | | 5 | 11 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_gloves, long_sleeves, looking_at_viewer, solo, open_mouth, black_bodysuit, blush, fang, :d, black_footwear, boots, cowboy_shot, red_cape, covered_navel, medium_breasts | | 6 | 19 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, blush, nipples, looking_at_viewer, navel, 1boy, hetero, open_mouth, pov, solo_focus, vaginal, penis, sweat, hair_between_eyes, large_breasts, nude, girl_on_top, :d, breasts_apart, cowgirl_position, cum_in_pussy, mosaic_censoring, red_cape, spread_legs, bar_censor, happy_sex, naked_cape | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1boy, 1girl, blush, hetero, large_breasts, penis, solo_focus, huge_breasts, long_sleeves, nipples, paizuri, bar_censor, open_mouth, pov, cape, cum_on_breasts, ejaculation, heart, looking_at_viewer, shirt, simple_background, smile, sweat, tongue | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | 
![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | fellatio, penis, 1girl, blush, disembodied_head, solo_focus, futa_with_female, holding_head, sweat, 1boy, cape, hetero, looking_at_viewer | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shirt | long_sleeves | miniskirt | pleated_skirt | red_skirt | solo | white_background | simple_background | hair_between_eyes | looking_at_viewer | red_cape | cowboy_shot | red_cloak | blush | covered_mouth | cape | shirt | skirt | disembodied_head | >_< | closed_eyes | open_mouth | black_bodysuit | black_gloves | medium_breasts | yellow_background | character_name | covered_navel | high_collar | jacket | standing | zipper_pull_tab | fang | :d | black_footwear | boots | nipples | navel | 1boy | hetero | pov | solo_focus | vaginal | penis | sweat | large_breasts | nude | girl_on_top | breasts_apart | cowgirl_position | cum_in_pussy | mosaic_censoring | spread_legs | bar_censor | happy_sex | naked_cape | huge_breasts | paizuri | cum_on_breasts | ejaculation | heart | smile | tongue | fellatio | futa_with_female | holding_head | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:---------------|:------------|:----------------|:------------|:-------|:-------------------|:--------------------|:--------------------|:--------------------|:-----------|:--------------|:------------|:--------|:----------------|:-------|:--------|:--------|:-------------------|:------|:--------------|:-------------|:-----------------|:---------------|:-----------------|:--------------------|:-----------------|:----------------|:--------------|:---------|:-----------|:------------------|:-------|:-----|:-----------------|:--------|:----------|:--------|:-------|:---------|:------|:-------------|:----------|:--------|:--------|:----------------|:-------|:--------------|:----------------|:-------------------|:---------------|:-------------------|:--------------|:-------------|:------------|:-------------|:---------------|:----------|:-----------------|:--------------|:--------|:--------|:---------|:-----------|:-------------------|:---------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | | | X | | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | | | X | | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | | | | X | X | X | | | | | | | | X | | | X | X | X | 
X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | | | X | | X | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 11 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | | X | | | | X | X | X | | X | | | | | | | | X | X | X | X | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 19 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | | | | | | | X | X | X | | | X | | | | | | | | X | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | | | | | | X | | X | | | | X | | X | X | | | | | X | | | | | | | | | | | | | | | X | | X | X | X | X | | X | X | X | | | | | | | | X | | | X | X | X | X | X | X | X | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | | | | | | | | X | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | X | X | X |
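The packages other than `raw` in the "List of Packages" table at the top of this card are plain IMG+TXT zip archives, so they can be fetched with `hf_hub_download` just like the raw archive. This is a minimal sketch, assuming the `dataset-800.zip` filename from that table; the local directory name is arbitrary:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the IMG+TXT package whose shorter side does not exceed 800 pixels
# (filename taken from the "List of Packages" table above).
zip_file = hf_hub_download(
    repo_id='CyberHarem/sekibanki_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract the images and their tag text files into a local directory.
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```

Since these packages are described as plain images plus text tag files, they should be usable without the waifuc tooling.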
CyberHarem/sekibanki_touhou
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-08-18T10:35:59+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-14T18:13:17+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of sekibanki/赤蛮奇 (Touhou) ================================= This is the dataset of sekibanki/赤蛮奇 (Touhou), containing 500 images and their tags. The core tags of this character are 'bow, red\_hair, hair\_bow, short\_hair, red\_eyes, blue\_bow, bangs, breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
2902d0fe1e4ac3e80603555d1ac9443273f3ca3a
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-do2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/dvruette/llama-13b-pretrained-sft-do2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained-sft-do2](https://huggingface.co/dvruette/llama-13b-pretrained-sft-do2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-do2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-21T22:43:14.661061](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-do2/blob/main/results_2023-10-21T22-43-14.661061.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2528313758389262,
        "em_stderr": 0.004451070247505258,
        "f1": 0.3196245805369137,
        "f1_stderr": 0.004416910326006887,
        "acc": 0.42391092962847055,
        "acc_stderr": 0.010031261264359954
    },
    "harness|drop|3": {
        "em": 0.2528313758389262,
        "em_stderr": 0.004451070247505258,
        "f1": 0.3196245805369137,
        "f1_stderr": 0.004416910326006887
    },
    "harness|gsm8k|5": {
        "acc": 0.09249431387414708,
        "acc_stderr": 0.007980396874560168
    },
    "harness|winogrande|5": {
        "acc": 0.755327545382794,
        "acc_stderr": 0.012082125654159738
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
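Besides the aggregated results, each per-task configuration of this repository can also be pulled via its "latest" split. This is a minimal sketch, assuming the "harness_drop_3" config and "latest" split names listed in this repository's configuration list:

```python
from datasets import load_dataset

# Per-example DROP details from the latest evaluation run of
# dvruette/llama-13b-pretrained-sft-do2.
drop_details = load_dataset(
    "open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-do2",
    "harness_drop_3",
    split="latest",
)
print(len(drop_details), drop_details.column_names)
```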
open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-do2
[ "region:us" ]
2023-08-18T10:36:01+00:00
{"pretty_name": "Evaluation run of dvruette/llama-13b-pretrained-sft-do2", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained-sft-do2](https://huggingface.co/dvruette/llama-13b-pretrained-sft-do2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-do2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T22:43:14.661061](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-do2/blob/main/results_2023-10-21T22-43-14.661061.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2528313758389262,\n \"em_stderr\": 0.004451070247505258,\n \"f1\": 0.3196245805369137,\n \"f1_stderr\": 0.004416910326006887,\n \"acc\": 0.42391092962847055,\n \"acc_stderr\": 0.010031261264359954\n },\n \"harness|drop|3\": {\n \"em\": 0.2528313758389262,\n \"em_stderr\": 0.004451070247505258,\n \"f1\": 0.3196245805369137,\n \"f1_stderr\": 0.004416910326006887\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09249431387414708,\n \"acc_stderr\": 0.007980396874560168\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/llama-13b-pretrained-sft-do2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T22_43_14.661061", "path": ["**/details_harness|drop|3_2023-10-21T22-43-14.661061.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T22-43-14.661061.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T22_43_14.661061", "path": ["**/details_harness|gsm8k|5_2023-10-21T22-43-14.661061.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T22-43-14.661061.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:57:06.342295.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:57:06.342295.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:57:06.342295.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:57:06.342295.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:57:06.342295.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:57:06.342295.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T22_43_14.661061", "path": ["**/details_harness|winogrande|5_2023-10-21T22-43-14.661061.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T22-43-14.661061.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_57_06.342295", "path": ["results_2023-07-19T18:57:06.342295.parquet"]}, {"split": "2023_10_21T22_43_14.661061", "path": ["results_2023-10-21T22-43-14.661061.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T22-43-14.661061.parquet"]}]}]}
2023-10-21T21:43:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-do2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-do2 on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-21T22:43:14.661061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-do2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-do2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-21T22:43:14.661061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-do2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-do2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-21T22:43:14.661061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-do2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-do2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T22:43:14.661061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5d7548b7e5d243a88e0cf116b92fffc4b5b3e5a0
# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/oasst-gpt-neox-20b-3000-steps](https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T10:27:17.935969](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps/blob/main/results_2023-09-17T10-27-17.935969.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.03261325503355705,
        "em_stderr": 0.0018190171380944452,
        "f1": 0.08462353187919494,
        "f1_stderr": 0.0021613692798517184,
        "acc": 0.35813018759877047,
        "acc_stderr": 0.008817523952258153
    },
    "harness|drop|3": {
        "em": 0.03261325503355705,
        "em_stderr": 0.0018190171380944452,
        "f1": 0.08462353187919494,
        "f1_stderr": 0.0021613692798517184
    },
    "harness|gsm8k|5": {
        "acc": 0.02880970432145565,
        "acc_stderr": 0.00460748428376746
    },
    "harness|winogrande|5": {
        "acc": 0.6874506708760852,
        "acc_stderr": 0.013027563620748847
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
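As a small additional sketch (assuming only the standard `datasets` API and the configuration names listed in this card's metadata), the per-sample details behind the Winogrande score above can also be pulled into pandas for inspection; the exact columns depend on the harness version used for the run:

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande evaluation; "latest" resolves to the
# most recent run of this model (the 2023-09-17 run referenced in this card).
details = load_dataset(
    "open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps",
    "harness_winogrande_5",
    split="latest",
)

df = details.to_pandas()
print(df.shape)             # one row per evaluated example
print(df.columns.tolist())  # column names vary with the harness version
```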
open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps
[ "region:us" ]
2023-08-18T10:36:10+00:00
{"pretty_name": "Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/oasst-gpt-neox-20b-3000-steps](https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T10:27:17.935969](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-3000-steps/blob/main/results_2023-09-17T10-27-17.935969.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03261325503355705,\n \"em_stderr\": 0.0018190171380944452,\n \"f1\": 0.08462353187919494,\n \"f1_stderr\": 0.0021613692798517184,\n \"acc\": 0.35813018759877047,\n \"acc_stderr\": 0.008817523952258153\n },\n \"harness|drop|3\": {\n \"em\": 0.03261325503355705,\n \"em_stderr\": 0.0018190171380944452,\n \"f1\": 0.08462353187919494,\n \"f1_stderr\": 0.0021613692798517184\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02880970432145565,\n \"acc_stderr\": 0.00460748428376746\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6874506708760852,\n \"acc_stderr\": 0.013027563620748847\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/oasst-gpt-neox-20b-3000-steps", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T10_27_17.935969", "path": ["**/details_harness|drop|3_2023-09-17T10-27-17.935969.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T10-27-17.935969.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T10_27_17.935969", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-27-17.935969.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-27-17.935969.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hellaswag|10_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:33:10.003072.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:33:10.003072.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T10_27_17.935969", "path": ["**/details_harness|winogrande|5_2023-09-17T10-27-17.935969.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T10-27-17.935969.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T21_33_10.003072", "path": ["results_2023-07-19T21:33:10.003072.parquet"]}, {"split": "2023_09_17T10_27_17.935969", "path": ["results_2023-09-17T10-27-17.935969.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T10-27-17.935969.parquet"]}]}]}
2023-09-17T09:27:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-3000-steps on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T10:27:17.935969 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-3000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T10:27:17.935969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-3000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T10:27:17.935969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-3000-steps## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-3000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T10:27:17.935969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
155da2a876771e20d525ce7bee04257ae111521b
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-dropout ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/dvruette/llama-13b-pretrained-dropout - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained-dropout](https://huggingface.co/dvruette/llama-13b-pretrained-dropout) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_dvruette__llama-13b-pretrained-dropout", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-18T13:29:36.249394](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-dropout/blob/main/results_2023-10-18T13-29-36.249394.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.20501258389261745, "em_stderr": 0.0041343766395959035, "f1": 0.2702611157718119, "f1_stderr": 0.004144727885990915, "acc": 0.43522094959648105, "acc_stderr": 0.01051473093615015 }, "harness|drop|3": { "em": 0.20501258389261745, "em_stderr": 0.0041343766395959035, "f1": 0.2702611157718119, "f1_stderr": 0.004144727885990915 }, "harness|gsm8k|5": { "acc": 0.11827141774071266, "acc_stderr": 0.008895075852434953 }, "harness|winogrande|5": { "acc": 0.7521704814522494, "acc_stderr": 0.01213438601986535 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_dvruette__llama-13b-pretrained-dropout
[ "region:us" ]
2023-08-18T10:36:18+00:00
{"pretty_name": "Evaluation run of dvruette/llama-13b-pretrained-dropout", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained-dropout](https://huggingface.co/dvruette/llama-13b-pretrained-dropout) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__llama-13b-pretrained-dropout\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T13:29:36.249394](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-dropout/blob/main/results_2023-10-18T13-29-36.249394.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20501258389261745,\n \"em_stderr\": 0.0041343766395959035,\n \"f1\": 0.2702611157718119,\n \"f1_stderr\": 0.004144727885990915,\n \"acc\": 0.43522094959648105,\n \"acc_stderr\": 0.01051473093615015\n },\n \"harness|drop|3\": {\n \"em\": 0.20501258389261745,\n \"em_stderr\": 0.0041343766395959035,\n \"f1\": 0.2702611157718119,\n \"f1_stderr\": 0.004144727885990915\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11827141774071266,\n \"acc_stderr\": 0.008895075852434953\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/llama-13b-pretrained-dropout", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T13_29_36.249394", "path": ["**/details_harness|drop|3_2023-10-18T13-29-36.249394.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T13-29-36.249394.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T13_29_36.249394", "path": ["**/details_harness|gsm8k|5_2023-10-18T13-29-36.249394.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T13-29-36.249394.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:40:51.054216.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:40:51.054216.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:40:51.054216.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:40:51.054216.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:40:51.054216.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:40:51.054216.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T13_29_36.249394", "path": ["**/details_harness|winogrande|5_2023-10-18T13-29-36.249394.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T13-29-36.249394.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_40_51.054216", "path": ["results_2023-07-19T19:40:51.054216.parquet"]}, {"split": "2023_10_18T13_29_36.249394", "path": ["results_2023-10-18T13-29-36.249394.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T13-29-36.249394.parquet"]}]}]}
2023-10-18T12:29:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-dropout ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-dropout on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T13:29:36.249394 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
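The plain-text rendering above drops the snippet that the "To load the details from a run" sentence refers to. Below is a minimal sketch of that load, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming pattern (the repo id is inferred, not quoted from this card):

```python
from datasets import load_dataset

# Hypothetical reconstruction of the stripped example; the repository id is
# inferred from the naming convention used by the other cards in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_dvruette__llama-13b-pretrained-dropout",
    "harness_winogrande_5",  # one configuration per evaluated task
    split="train",           # per the card, "train" points at the latest results
)
print(data)
```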
[ "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-dropout", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-dropout on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T13:29:36.249394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-dropout", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-dropout on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T13:29:36.249394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-dropout## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-dropout on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T13:29:36.249394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6ee7c21be4921274f3e238191867ec72a4490e62
# Dataset Card for Evaluation run of dvruette/oasst-pythia-6.9b-4000-steps

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/dvruette/oasst-pythia-6.9b-4000-steps
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/oasst-pythia-6.9b-4000-steps](https://huggingface.co/dvruette/oasst-pythia-6.9b-4000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-pythia-6.9b-4000-steps",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T19:21:18.412004](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-6.9b-4000-steps/blob/main/results_2023-10-16T19-21-18.412004.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0436241610738255,
        "em_stderr": 0.0020917871626346383,
        "f1": 0.09968435402684558,
        "f1_stderr": 0.0024193671447343803,
        "acc": 0.3116511576000809,
        "acc_stderr": 0.007828441968459233
    },
    "harness|drop|3": {
        "em": 0.0436241610738255,
        "em_stderr": 0.0020917871626346383,
        "f1": 0.09968435402684558,
        "f1_stderr": 0.0024193671447343803
    },
    "harness|gsm8k|5": {
        "acc": 0.00530705079605762,
        "acc_stderr": 0.0020013057209480427
    },
    "harness|winogrande|5": {
        "acc": 0.6179952644041041,
        "acc_stderr": 0.013655578215970424
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
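The card above describes an aggregated "results" configuration and a "latest" split alias but only demonstrates a per-task load. A short sketch of pulling those aggregated metrics follows; the config and split names are taken from this repo's metadata below, while the row layout itself is not documented here:

```python
from datasets import load_dataset

# Load the aggregated metrics rather than per-example details. The "results"
# config and its "latest" split are declared in this dataset's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_dvruette__oasst-pythia-6.9b-4000-steps",
    "results",
    split="latest",
)
print(results.features)  # inspect the schema before relying on column names
print(results[0])
```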
open-llm-leaderboard/details_dvruette__oasst-pythia-6.9b-4000-steps
[ "region:us" ]
2023-08-18T10:36:27+00:00
{"pretty_name": "Evaluation run of dvruette/oasst-pythia-6.9b-4000-steps", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/oasst-pythia-6.9b-4000-steps](https://huggingface.co/dvruette/oasst-pythia-6.9b-4000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-pythia-6.9b-4000-steps\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T19:21:18.412004](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-6.9b-4000-steps/blob/main/results_2023-10-16T19-21-18.412004.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0436241610738255,\n \"em_stderr\": 0.0020917871626346383,\n \"f1\": 0.09968435402684558,\n \"f1_stderr\": 0.0024193671447343803,\n \"acc\": 0.3116511576000809,\n \"acc_stderr\": 0.007828441968459233\n },\n \"harness|drop|3\": {\n \"em\": 0.0436241610738255,\n \"em_stderr\": 0.0020917871626346383,\n \"f1\": 0.09968435402684558,\n \"f1_stderr\": 0.0024193671447343803\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.0020013057209480427\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6179952644041041,\n \"acc_stderr\": 0.013655578215970424\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/oasst-pythia-6.9b-4000-steps", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T19_21_18.412004", "path": ["**/details_harness|drop|3_2023-10-16T19-21-18.412004.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T19-21-18.412004.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T19_21_18.412004", "path": ["**/details_harness|gsm8k|5_2023-10-16T19-21-18.412004.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T19-21-18.412004.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:39:14.734734.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:39:14.734734.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:39:14.734734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:39:14.734734.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:39:14.734734.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:39:14.734734.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T19_21_18.412004", "path": ["**/details_harness|winogrande|5_2023-10-16T19-21-18.412004.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T19-21-18.412004.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_39_14.734734", "path": ["results_2023-07-19T17:39:14.734734.parquet"]}, {"split": "2023_10_16T19_21_18.412004", "path": ["results_2023-10-16T19-21-18.412004.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T19-21-18.412004.parquet"]}]}]}
2023-10-16T18:21:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/oasst-pythia-6.9b-4000-steps ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model dvruette/oasst-pythia-6.9b-4000-steps on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T19:21:18.412004 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
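As with the other plain-text card above, the load example this text refers to was stripped during flattening. A sketch using the split names declared in this repo's metadata (where the timestamped split and "latest" point at the same run):

```python
from datasets import load_dataset

# "latest" aliases the most recent run; the timestamped split
# "2023_10_16T19_21_18.412004" listed in the metadata refers to the same files.
details = load_dataset(
    "open-llm-leaderboard/details_dvruette__oasst-pythia-6.9b-4000-steps",
    "harness_winogrande_5",
    split="latest",
)
print(len(details))
```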
[ "# Dataset Card for Evaluation run of dvruette/oasst-pythia-6.9b-4000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-pythia-6.9b-4000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T19:21:18.412004(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/oasst-pythia-6.9b-4000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-pythia-6.9b-4000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T19:21:18.412004(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/oasst-pythia-6.9b-4000-steps## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-pythia-6.9b-4000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T19:21:18.412004(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d61f85e3ad0dd19f594035cdc8ee88e97b71b3ae
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained-sft-epoch-1](https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T22:06:45.407147](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1/blob/main/results_2023-10-18T22-06-45.407147.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2225251677852349,
        "em_stderr": 0.004259635026591598,
        "f1": 0.287082634228188,
        "f1_stderr": 0.004255345667621572,
        "acc": 0.45729496587127727,
        "acc_stderr": 0.01062102533078612
    },
    "harness|drop|3": {
        "em": 0.2225251677852349,
        "em_stderr": 0.004259635026591598,
        "f1": 0.287082634228188,
        "f1_stderr": 0.004255345667621572
    },
    "harness|gsm8k|5": {
        "acc": 0.13874147081122062,
        "acc_stderr": 0.009521649920798148
    },
    "harness|winogrande|5": {
        "acc": 0.7758484609313339,
        "acc_stderr": 0.011720400740774092
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
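If you only need the aggregated numbers rather than the per-sample details, the "results" configuration can be loaded the same way. This is a minimal sketch, assuming the config and split names listed in this card's file manifest ("results" with a "latest" split); the variable names are illustrative only.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; "latest" tracks the newest results file.
results = load_dataset(
    "open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores shown under "Latest results"
```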
open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1
[ "region:us" ]
2023-08-18T10:36:35+00:00
{"pretty_name": "Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained-sft-epoch-1](https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T22:06:45.407147](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1/blob/main/results_2023-10-18T22-06-45.407147.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2225251677852349,\n \"em_stderr\": 0.004259635026591598,\n \"f1\": 0.287082634228188,\n \"f1_stderr\": 0.004255345667621572,\n \"acc\": 0.45729496587127727,\n \"acc_stderr\": 0.01062102533078612\n },\n \"harness|drop|3\": {\n \"em\": 0.2225251677852349,\n \"em_stderr\": 0.004259635026591598,\n \"f1\": 0.287082634228188,\n \"f1_stderr\": 0.004255345667621572\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13874147081122062,\n \"acc_stderr\": 0.009521649920798148\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774092\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T22_06_45.407147", "path": ["**/details_harness|drop|3_2023-10-18T22-06-45.407147.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T22-06-45.407147.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T22_06_45.407147", "path": ["**/details_harness|gsm8k|5_2023-10-18T22-06-45.407147.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T22-06-45.407147.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:41:46.574881.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:41:46.574881.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T22_06_45.407147", "path": ["**/details_harness|winogrande|5_2023-10-18T22-06-45.407147.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T22-06-45.407147.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_41_46.574881", "path": ["results_2023-07-19T19:41:46.574881.parquet"]}, {"split": "2023_10_18T22_06_45.407147", "path": ["results_2023-10-18T22-06-45.407147.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T22-06-45.407147.parquet"]}]}]}
2023-10-18T21:06:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-epoch-1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T22:06:45.407147 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
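The loading snippet that the "To load the details from a run" sentence above refers to, taken verbatim from the full card for this dataset:

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande eval of this model.
data = load_dataset(
    "open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1",
    "harness_winogrande_5",
    split="train",
)
```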
[ "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-epoch-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T22:06:45.407147(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-epoch-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T22:06:45.407147(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained-sft-epoch-1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T22:06:45.407147(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7d93cd095c8de326b8394b54a3603df8f10cab28
# Dataset Card for Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/oasst-pythia-12b-pretrained-sft](https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-28T17:50:05.517714](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft/blob/main/results_2023-10-28T17-50-05.517714.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001363255033557047,
        "em_stderr": 0.00037786091964609,
        "f1": 0.059786073825503584,
        "f1_stderr": 0.001416388770967041,
        "acc": 0.34960952576423865,
        "acc_stderr": 0.00936606058645266
    },
    "harness|drop|3": {
        "em": 0.001363255033557047,
        "em_stderr": 0.00037786091964609,
        "f1": 0.059786073825503584,
        "f1_stderr": 0.001416388770967041
    },
    "harness|gsm8k|5": {
        "acc": 0.0401819560272934,
        "acc_stderr": 0.00540943973697052
    },
    "harness|winogrande|5": {
        "acc": 0.659037095501184,
        "acc_stderr": 0.0133226814359348
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
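The per-sample details of a single eval can be pulled through the task-specific configurations in the same way. This is a minimal sketch, assuming the "harness_gsm8k_5" config and its "latest" split as named in this card's file manifest; the variable name is illustrative only.

```python
from datasets import load_dataset

# Per-sample GSM8K details; "latest" resolves to the 2023-10-28 run reported above.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft",
    "harness_gsm8k_5",
    split="latest",
)
print(len(gsm8k_details))  # number of evaluated GSM8K samples in the latest run
```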
open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft
[ "region:us" ]
2023-08-18T10:36:43+00:00
{"pretty_name": "Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/oasst-pythia-12b-pretrained-sft](https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T17:50:05.517714](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft/blob/main/results_2023-10-28T17-50-05.517714.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964609,\n \"f1\": 0.059786073825503584,\n \"f1_stderr\": 0.001416388770967041,\n \"acc\": 0.34960952576423865,\n \"acc_stderr\": 0.00936606058645266\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964609,\n \"f1\": 0.059786073825503584,\n \"f1_stderr\": 0.001416388770967041\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \"acc_stderr\": 0.00540943973697052\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.659037095501184,\n \"acc_stderr\": 0.0133226814359348\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T17_50_05.517714", "path": ["**/details_harness|drop|3_2023-10-28T17-50-05.517714.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T17-50-05.517714.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T17_50_05.517714", "path": ["**/details_harness|gsm8k|5_2023-10-28T17-50-05.517714.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T17-50-05.517714.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:03:03.088618.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:03:03.088618.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T17_50_05.517714", "path": ["**/details_harness|winogrande|5_2023-10-28T17-50-05.517714.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T17-50-05.517714.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_03_03.088618", "path": ["results_2023-07-19T18:03:03.088618.parquet"]}, {"split": "2023_10_28T17_50_05.517714", "path": ["results_2023-10-28T17-50-05.517714.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T17-50-05.517714.parquet"]}]}]}
2023-10-28T16:50:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model dvruette/oasst-pythia-12b-pretrained-sft on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (the loading call is reproduced after this card text):

## Latest results

These are the latest results from run 2023-10-28T17:50:05.517714 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
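A minimal loading sketch for the card above, taken from this record's dataset_summary metadata earlier in the entry; the repository name and config name come from there, and nothing else is assumed:

```python
from datasets import load_dataset

# Load the per-example details for one evaluated task (WinoGrande, 5-shot).
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft",
	"harness_winogrande_5",
	split="train")
```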
[ "# Dataset Card for Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-pythia-12b-pretrained-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T17:50:05.517714(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-pythia-12b-pretrained-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T17:50:05.517714(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-pythia-12b-pretrained-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T17:50:05.517714(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
73ae4bcbc687d8279f4eeed2e79aaae9092a3c2d
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/dvruette/llama-13b-pretrained
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained](https://huggingface.co/dvruette/llama-13b-pretrained) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__llama-13b-pretrained",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T17:33:50.415201](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained/blob/main/results_2023-10-18T17-33-50.415201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.19431627516778524,
        "em_stderr": 0.004052066229872751,
        "f1": 0.25224412751677777,
        "f1_stderr": 0.004066214952392991,
        "acc": 0.46513107858970915,
        "acc_stderr": 0.01097629037543693
    },
    "harness|drop|3": {
        "em": 0.19431627516778524,
        "em_stderr": 0.004052066229872751,
        "f1": 0.25224412751677777,
        "f1_stderr": 0.004066214952392991
    },
    "harness|gsm8k|5": {
        "acc": 0.1607278241091736,
        "acc_stderr": 0.010116708586037183
    },
    "harness|winogrande|5": {
        "acc": 0.7695343330702447,
        "acc_stderr": 0.011835872164836676
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
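A supplementary sketch, not part of the card itself: the aggregated scores described under the "results" configuration can be loaded the same way. The split name "latest" is an assumption based on the split-naming convention described in the card (timestamped splits plus a "latest" alias pointing at the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_dvruette__llama-13b-pretrained",
	"results",
	split="latest")
```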
open-llm-leaderboard/details_dvruette__llama-13b-pretrained
[ "region:us" ]
2023-08-18T10:36:52+00:00
{"pretty_name": "Evaluation run of dvruette/llama-13b-pretrained", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained](https://huggingface.co/dvruette/llama-13b-pretrained) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__llama-13b-pretrained\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T17:33:50.415201](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained/blob/main/results_2023-10-18T17-33-50.415201.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19431627516778524,\n \"em_stderr\": 0.004052066229872751,\n \"f1\": 0.25224412751677777,\n \"f1_stderr\": 0.004066214952392991,\n \"acc\": 0.46513107858970915,\n \"acc_stderr\": 0.01097629037543693\n },\n \"harness|drop|3\": {\n \"em\": 0.19431627516778524,\n \"em_stderr\": 0.004052066229872751,\n \"f1\": 0.25224412751677777,\n \"f1_stderr\": 0.004066214952392991\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1607278241091736,\n \"acc_stderr\": 0.010116708586037183\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836676\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/llama-13b-pretrained", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T17_33_50.415201", "path": ["**/details_harness|drop|3_2023-10-18T17-33-50.415201.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T17-33-50.415201.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T17_33_50.415201", "path": ["**/details_harness|gsm8k|5_2023-10-18T17-33-50.415201.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T17-33-50.415201.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:55:00.882635.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:55:00.882635.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T17_33_50.415201", "path": ["**/details_harness|winogrande|5_2023-10-18T17-33-50.415201.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T17-33-50.415201.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_55_00.882635", "path": ["results_2023-07-19T18:55:00.882635.parquet"]}, {"split": "2023_10_18T17_33_50.415201", "path": ["results_2023-10-18T17-33-50.415201.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T17-33-50.415201.parquet"]}]}]}
2023-10-18T16:34:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T17:33:50.415201 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
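The plain-text rendering above drops the code snippet the card refers to. As a minimal sketch, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (the exact repository id is not stated in this record) and using the `harness_winogrande_5` configuration listed in this record's metadata:

```python
from datasets import load_dataset

# Repository id is assumed from the usual "details_<org>__<model>" naming
# convention; the per-task configuration and the "latest" split come from
# this record's "configs" metadata.
data = load_dataset(
    "open-llm-leaderboard/details_dvruette__llama-13b-pretrained",
    "harness_winogrande_5",
    split="latest",
)
```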
[ "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T17:33:50.415201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T17:33:50.415201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/llama-13b-pretrained on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T17:33:50.415201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
59ce03f3a6b1092999dfeb7a3da68a7f1b1a018d
# Dataset Card for Evaluation run of dvruette/gpt-neox-20b-full-precision ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/dvruette/gpt-neox-20b-full-precision - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [dvruette/gpt-neox-20b-full-precision](https://huggingface.co/dvruette/gpt-neox-20b-full-precision) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T10:49:53.793437](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision/blob/main/results_2023-10-15T10-49-53.793437.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.008179530201342282, "em_stderr": 0.0009224022743034369, "f1": 0.06148594798657739, "f1_stderr": 0.0015819609377213373, "acc": 0.35462516448027825, "acc_stderr": 0.008753822199298638 }, "harness|drop|3": { "em": 0.008179530201342282, "em_stderr": 0.0009224022743034369, "f1": 0.06148594798657739, "f1_stderr": 0.0015819609377213373 }, "harness|gsm8k|5": { "acc": 0.026535253980288095, "acc_stderr": 0.0044270459872651595 }, "harness|winogrande|5": { "acc": 0.6827150749802684, "acc_stderr": 0.013080598411332115 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
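Beyond the per-task example shown in the card, the aggregated metrics can be pulled from the `results` configuration; a minimal sketch, with the configuration and split names taken from this record's metadata ("latest" always tracks the newest run, while a timestamped split pins a specific one):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results_latest = load_dataset(
    "open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision",
    "results",
    split="latest",
)

# Pin a specific run via its timestamped split name.
results_run = load_dataset(
    "open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision",
    "results",
    split="2023_10_15T10_49_53.793437",
)
```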
open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision
[ "region:us" ]
2023-08-18T10:37:13+00:00
{"pretty_name": "Evaluation run of dvruette/gpt-neox-20b-full-precision", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/gpt-neox-20b-full-precision](https://huggingface.co/dvruette/gpt-neox-20b-full-precision) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T10:49:53.793437](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision/blob/main/results_2023-10-15T10-49-53.793437.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008179530201342282,\n \"em_stderr\": 0.0009224022743034369,\n \"f1\": 0.06148594798657739,\n \"f1_stderr\": 0.0015819609377213373,\n \"acc\": 0.35462516448027825,\n \"acc_stderr\": 0.008753822199298638\n },\n \"harness|drop|3\": {\n \"em\": 0.008179530201342282,\n \"em_stderr\": 0.0009224022743034369,\n \"f1\": 0.06148594798657739,\n \"f1_stderr\": 0.0015819609377213373\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \"acc_stderr\": 0.0044270459872651595\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6827150749802684,\n \"acc_stderr\": 0.013080598411332115\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/gpt-neox-20b-full-precision", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T10_49_53.793437", "path": ["**/details_harness|drop|3_2023-10-15T10-49-53.793437.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T10-49-53.793437.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T10_49_53.793437", "path": ["**/details_harness|gsm8k|5_2023-10-15T10-49-53.793437.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T10-49-53.793437.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hellaswag|10_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:36:17.720122.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:36:17.720122.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:36:17.720122.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:36:17.720122.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:36:17.720122.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:36:17.720122.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T10_49_53.793437", "path": ["**/details_harness|winogrande|5_2023-10-15T10-49-53.793437.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T10-49-53.793437.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T21_36_17.720122", "path": ["results_2023-07-19T21:36:17.720122.parquet"]}, {"split": "2023_10_15T10_49_53.793437", "path": ["results_2023-10-15T10-49-53.793437.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T10-49-53.793437.parquet"]}]}]}
2023-10-15T09:50:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/gpt-neox-20b-full-precision

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model dvruette/gpt-neox-20b-full-precision on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-15T10:49:53.793437 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
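The stripped summary above omits the snippet it refers to. Here is a minimal sketch of the intended call, assuming this run follows the leaderboard's usual repository naming pattern (`open-llm-leaderboard/details_<org>__<model>`); the exact repository id is not stated in this record, so verify it on the Hub before relying on it:

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard's naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_dvruette__gpt-neox-20b-full-precision",
    "harness_winogrande_5",
    split="train",  # "train" always points at the latest run
)
```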
[ "# Dataset Card for Evaluation run of dvruette/gpt-neox-20b-full-precision", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/gpt-neox-20b-full-precision on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T10:49:53.793437(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/gpt-neox-20b-full-precision", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/gpt-neox-20b-full-precision on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T10:49:53.793437(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/gpt-neox-20b-full-precision## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/gpt-neox-20b-full-precision on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T10:49:53.793437(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2b56d218431fafb3f723d9964d318374046a15de
# Dataset Card for Evaluation run of dvruette/oasst-llama-13b-1000-steps

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/oasst-llama-13b-1000-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/oasst-llama-13b-1000-steps](https://huggingface.co/dvruette/oasst-llama-13b-1000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-19T08:21:45.540153](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps/blob/main/results_2023-10-19T08-21-45.540153.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0959521812080537,
        "em_stderr": 0.0030162183550142383,
        "f1": 0.16973573825503283,
        "f1_stderr": 0.003251453767412336,
        "acc": 0.44401178094667637,
        "acc_stderr": 0.010227191296479903
    },
    "harness|drop|3": {
        "em": 0.0959521812080537,
        "em_stderr": 0.0030162183550142383,
        "f1": 0.16973573825503283,
        "f1_stderr": 0.003251453767412336
    },
    "harness|gsm8k|5": {
        "acc": 0.11296436694465505,
        "acc_stderr": 0.008719339028833073
    },
    "harness|winogrande|5": {
        "acc": 0.7750591949486977,
        "acc_stderr": 0.011735043564126735
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
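The snippet above targets a single task configuration. As a complementary sketch, this is one way to pull the aggregated "results" configuration mentioned in the summary; the exact column layout of that split is not documented in this record, so the print call is only for inspection:

```python
from datasets import load_dataset

# Load the aggregated results; "latest" points at the most recent run,
# while each timestamped split (e.g. "2023_10_19T08_21_45.540153") keeps one run.
results = load_dataset(
    "open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps",
    "results",
    split="latest",
)
print(results)  # inspect the available columns before further processing
```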
open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps
[ "region:us" ]
2023-08-18T10:37:22+00:00
{"pretty_name": "Evaluation run of dvruette/oasst-llama-13b-1000-steps", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/oasst-llama-13b-1000-steps](https://huggingface.co/dvruette/oasst-llama-13b-1000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T08:21:45.540153](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps/blob/main/results_2023-10-19T08-21-45.540153.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0959521812080537,\n \"em_stderr\": 0.0030162183550142383,\n \"f1\": 0.16973573825503283,\n \"f1_stderr\": 0.003251453767412336,\n \"acc\": 0.44401178094667637,\n \"acc_stderr\": 0.010227191296479903\n },\n \"harness|drop|3\": {\n \"em\": 0.0959521812080537,\n \"em_stderr\": 0.0030162183550142383,\n \"f1\": 0.16973573825503283,\n \"f1_stderr\": 0.003251453767412336\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \"acc_stderr\": 0.008719339028833073\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/oasst-llama-13b-1000-steps", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T08_21_45.540153", "path": ["**/details_harness|drop|3_2023-10-19T08-21-45.540153.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T08-21-45.540153.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T08_21_45.540153", "path": ["**/details_harness|gsm8k|5_2023-10-19T08-21-45.540153.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T08-21-45.540153.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:48:56.824224.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:56.824224.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:56.824224.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T08_21_45.540153", "path": ["**/details_harness|winogrande|5_2023-10-19T08-21-45.540153.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T08-21-45.540153.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_48_56.824224", "path": ["results_2023-07-19T18:48:56.824224.parquet"]}, {"split": "2023_10_19T08_21_45.540153", "path": ["results_2023-10-19T08-21-45.540153.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T08-21-45.540153.parquet"]}]}]}
2023-10-19T07:21:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/oasst-llama-13b-1000-steps

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model dvruette/oasst-llama-13b-1000-steps on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-19T08:21:45.540153 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
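The stripped summary again refers to a load snippet without showing it. Beyond the single-configuration call shown earlier for this repository, a small exploratory sketch can enumerate the 64 per-task configurations before loading one; it uses `get_dataset_config_names` from the `datasets` library, and the "latest" split name is taken from the configuration metadata above:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_dvruette__oasst-llama-13b-1000-steps"

# List all per-task configurations (harness_arc_challenge_25, harness_winogrande_5, ...).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Then load any one of them, e.g. the winogrande details from the latest run.
winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")
```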
[ "# Dataset Card for Evaluation run of dvruette/oasst-llama-13b-1000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-llama-13b-1000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T08:21:45.540153(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/oasst-llama-13b-1000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-llama-13b-1000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T08:21:45.540153(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/oasst-llama-13b-1000-steps## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-llama-13b-1000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T08:21:45.540153(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
358e52b6267b97e889fedc206d434c57fae40e22
# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-1000-steps

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/oasst-gpt-neox-20b-1000-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dvruette/oasst-gpt-neox-20b-1000-steps](https://huggingface.co/dvruette/oasst-gpt-neox-20b-1000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-1000-steps",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T09:20:16.828896](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-1000-steps/blob/main/results_2023-10-18T09-20-16.828896.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.014052013422818792,
        "em_stderr": 0.0012054131682959785,
        "f1": 0.06994756711409417,
        "f1_stderr": 0.0017865342651761876,
        "acc": 0.34940158798640236,
        "acc_stderr": 0.009009306636314949
    },
    "harness|drop|3": {
        "em": 0.014052013422818792,
        "em_stderr": 0.0012054131682959785,
        "f1": 0.06994756711409417,
        "f1_stderr": 0.0017865342651761876
    },
    "harness|gsm8k|5": {
        "acc": 0.0310841546626232,
        "acc_stderr": 0.00478029671839337
    },
    "harness|winogrande|5": {
        "acc": 0.6677190213101816,
        "acc_stderr": 0.013238316554236526
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
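To make the split naming in the summary concrete, here is a minimal sketch that loads the same drop-details run twice: once via its timestamped split and once via the "latest" alias. The split names are taken from the configuration metadata of this record, not invented:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-1000-steps"

# Each run is stored under a split named after its timestamp...
run_split = load_dataset(repo, "harness_drop_3", split="2023_10_18T09_20_16.828896")

# ...and "latest" is an alias that always points at the most recent run.
latest_split = load_dataset(repo, "harness_drop_3", split="latest")

# Both splits reference the same parquet file in this repository, since only
# one drop run has been recorded so far.
assert len(run_split) == len(latest_split)
```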
open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-1000-steps
[ "region:us" ]
2023-08-18T10:37:39+00:00
{"pretty_name": "Evaluation run of dvruette/oasst-gpt-neox-20b-1000-steps", "dataset_summary": "Dataset automatically created during the evaluation run of model [dvruette/oasst-gpt-neox-20b-1000-steps](https://huggingface.co/dvruette/oasst-gpt-neox-20b-1000-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-1000-steps\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T09:20:16.828896](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-1000-steps/blob/main/results_2023-10-18T09-20-16.828896.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014052013422818792,\n \"em_stderr\": 0.0012054131682959785,\n \"f1\": 0.06994756711409417,\n \"f1_stderr\": 0.0017865342651761876,\n \"acc\": 0.34940158798640236,\n \"acc_stderr\": 0.009009306636314949\n },\n \"harness|drop|3\": {\n \"em\": 0.014052013422818792,\n \"em_stderr\": 0.0012054131682959785,\n \"f1\": 0.06994756711409417,\n \"f1_stderr\": 0.0017865342651761876\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0310841546626232,\n \"acc_stderr\": 0.00478029671839337\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6677190213101816,\n \"acc_stderr\": 0.013238316554236526\n }\n}\n```", "repo_url": "https://huggingface.co/dvruette/oasst-gpt-neox-20b-1000-steps", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T09_20_16.828896", "path": ["**/details_harness|drop|3_2023-10-18T09-20-16.828896.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T09-20-16.828896.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T09_20_16.828896", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-20-16.828896.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-20-16.828896.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hellaswag|10_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:57.909162.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:57.909162.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:57.909162.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:57.909162.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:43:57.909162.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:43:57.909162.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T09_20_16.828896", "path": ["**/details_harness|winogrande|5_2023-10-18T09-20-16.828896.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T09-20-16.828896.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T21_43_57.909162", "path": ["results_2023-07-19T21:43:57.909162.parquet"]}, {"split": "2023_10_18T09_20_16.828896", "path": ["results_2023-10-18T09-20-16.828896.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T09-20-16.828896.parquet"]}]}]}
2023-10-18T08:20:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-1000-steps

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-1000-steps on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the loading sketch after this card):

## Latest results

These are the latest results from run 2023-10-18T09:20:16.828896 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
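The loading snippet referenced above was stripped from this processed card text; here is a minimal sketch of it, using only the repository id and the "harness_winogrande_5" configuration listed in this record's metadata (any other config listed there should work the same way).

```python
# Minimal sketch: load the per-sample details for one evaluated task.
# Repository id and config name are taken from this record's metadata.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_dvruette__oasst-gpt-neox-20b-1000-steps",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest run
)
print(data)
```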
[ "# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-1000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-1000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T09:20:16.828896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-1000-steps", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-1000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T09:20:16.828896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dvruette/oasst-gpt-neox-20b-1000-steps## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dvruette/oasst-gpt-neox-20b-1000-steps on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T09:20:16.828896(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ab2a877d8658a17d54565c2544fd9f6c1ccda2c5
# Dataset Card for Evaluation run of yeontaek/llama-2-13b-Guanaco-QLoRA

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-13b-Guanaco-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13b-Guanaco-QLoRA](https://huggingface.co/yeontaek/llama-2-13b-Guanaco-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-13b-Guanaco-QLoRA",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T09:19:25.361261](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13b-Guanaco-QLoRA/blob/main/results_2023-10-22T09-19-25.361261.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0018875838926174498,
        "em_stderr": 0.00044451099905589624,
        "f1": 0.06432256711409379,
        "f1_stderr": 0.0013911197722481863,
        "acc": 0.44091694875395904,
        "acc_stderr": 0.010204605702764498
    },
    "harness|drop|3": {
        "em": 0.0018875838926174498,
        "em_stderr": 0.00044451099905589624,
        "f1": 0.06432256711409379,
        "f1_stderr": 0.0013911197722481863
    },
    "harness|gsm8k|5": {
        "acc": 0.10993176648976498,
        "acc_stderr": 0.008616195587865406
    },
    "harness|winogrande|5": {
        "acc": 0.7719021310181531,
        "acc_stderr": 0.011793015817663592
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
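The card above already shows how to load one task's details; as a complement, here is a hedged sketch of reading the aggregated "results" configuration, using only the config and split names ("results", "latest", and the timestamped splits) that appear in this record's metadata.

```python
# Sketch: load the aggregated scores for this evaluation run.
# The config name "results" and the split alias "latest" come from this
# record's dataset metadata; a timestamped split such as
# "2023_10_22T09_19_25.361261" selects one specific run instead.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__llama-2-13b-Guanaco-QLoRA",
    "results",
    split="latest",
)
print(results.column_names)  # inspect which aggregated metrics are stored
```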
open-llm-leaderboard/details_yeontaek__llama-2-13b-Guanaco-QLoRA
[ "region:us" ]
2023-08-18T10:37:49+00:00
{"pretty_name": "Evaluation run of yeontaek/llama-2-13b-Guanaco-QLoRA", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13b-Guanaco-QLoRA](https://huggingface.co/yeontaek/llama-2-13b-Guanaco-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-13b-Guanaco-QLoRA\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T09:19:25.361261](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13b-Guanaco-QLoRA/blob/main/results_2023-10-22T09-19-25.361261.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589624,\n \"f1\": 0.06432256711409379,\n \"f1_stderr\": 0.0013911197722481863,\n \"acc\": 0.44091694875395904,\n \"acc_stderr\": 0.010204605702764498\n },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589624,\n \"f1\": 0.06432256711409379,\n \"f1_stderr\": 0.0013911197722481863\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10993176648976498,\n \"acc_stderr\": 0.008616195587865406\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/llama-2-13b-Guanaco-QLoRA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T09_19_25.361261", "path": ["**/details_harness|drop|3_2023-10-22T09-19-25.361261.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T09-19-25.361261.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T09_19_25.361261", "path": ["**/details_harness|gsm8k|5_2023-10-22T09-19-25.361261.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T09-19-25.361261.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:23:10.081119.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:23:10.081119.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:23:10.081119.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:23:10.081119.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:23:10.081119.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:23:10.081119.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T09_19_25.361261", "path": ["**/details_harness|winogrande|5_2023-10-22T09-19-25.361261.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T09-19-25.361261.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T21_23_10.081119", "path": ["results_2023-08-09T21:23:10.081119.parquet"]}, {"split": "2023_10_22T09_19_25.361261", "path": ["results_2023-10-22T09-19-25.361261.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T09-19-25.361261.parquet"]}]}]}
2023-10-22T08:19:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yeontaek/llama-2-13b-Guanaco-QLoRA

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model yeontaek/llama-2-13b-Guanaco-QLoRA on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch just after this card):

## Latest results

These are the latest results from run 2023-10-22T09:19:25.361261 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
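The load call that the card above refers to was stripped from this copy of the text, so here is a minimal sketch. The repository id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention (the card itself only carries `URL` placeholders), and `harness_winogrande_5` is one of the configurations listed in this record's metadata:

```python
from datasets import load_dataset

# Assumed repository id, inferred from the "details_<org>__<model>" naming
# convention used by other Open LLM Leaderboard detail datasets; the card
# above only shows "URL" placeholders.
repo = "open-llm-leaderboard/details_yeontaek__llama-2-13b-Guanaco-QLoRA"

# Per-task details: the "train" split always points at the latest run.
data = load_dataset(repo, "harness_winogrande_5", split="train")

# Aggregated scores live in the separate "results" configuration;
# its "latest" split mirrors the most recent timestamped run.
results = load_dataset(repo, "results", split="latest")
```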
[ "# Dataset Card for Evaluation run of yeontaek/llama-2-13b-Guanaco-QLoRA", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-13b-Guanaco-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T09:19:25.361261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yeontaek/llama-2-13b-Guanaco-QLoRA", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-13b-Guanaco-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T09:19:25.361261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yeontaek/llama-2-13b-Guanaco-QLoRA## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-13b-Guanaco-QLoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T09:19:25.361261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
680c7e5481466744b0ac93f5e194cbc9bf9c7da5
# Dataset Card for Evaluation run of chaoyi-wu/MedLLaMA_13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/chaoyi-wu/MedLLaMA_13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [chaoyi-wu/MedLLaMA_13B](https://huggingface.co/chaoyi-wu/MedLLaMA_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chaoyi-wu__MedLLaMA_13B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-24T13:04:01.266274](https://huggingface.co/datasets/open-llm-leaderboard/details_chaoyi-wu__MedLLaMA_13B/blob/main/results_2023-07-24T13%3A04%3A01.266274.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.46685175478824187, "acc_stderr": 0.03531409019484935, "acc_norm": 0.47077526563025673, "acc_norm_stderr": 0.035299387024960424, "mc1": 0.2582619339045288, "mc1_stderr": 0.0153218216884762, "mc2": 0.4053787386286284, "mc2_stderr": 0.013893490031868357 }, "harness|arc:challenge|25": { "acc": 0.5102389078498294, "acc_stderr": 0.014608326906285012, "acc_norm": 0.5426621160409556, "acc_norm_stderr": 0.014558106543924065 }, "harness|hellaswag|10": { "acc": 0.5862378012348137, "acc_stderr": 0.004915003499517829, "acc_norm": 0.7853017327225652, "acc_norm_stderr": 0.004097736838432052 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5259259259259259, "acc_stderr": 0.04313531696750575, "acc_norm": 0.5259259259259259, "acc_norm_stderr": 0.04313531696750575 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.48026315789473684, "acc_stderr": 0.040657710025626036, "acc_norm": 0.48026315789473684, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.49056603773584906, "acc_stderr": 0.0307673947078081, "acc_norm": 0.49056603773584906, "acc_norm_stderr": 0.0307673947078081 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4791666666666667, "acc_stderr": 0.041775789507399935, "acc_norm": 0.4791666666666667, "acc_norm_stderr": 0.041775789507399935 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.42196531791907516, "acc_stderr": 0.03765746693865151, "acc_norm": 0.42196531791907516, "acc_norm_stderr": 0.03765746693865151 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237657, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237657 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4, "acc_stderr": 0.03202563076101737, "acc_norm": 0.4, "acc_norm_stderr": 0.03202563076101737 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03947152782669415, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03947152782669415 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3793103448275862, "acc_stderr": 0.04043461861916747, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.04043461861916747 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.23809523809523808, "acc_stderr": 0.021935878081184766, "acc_norm": 0.23809523809523808, "acc_norm_stderr": 0.021935878081184766 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04216370213557835, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04216370213557835 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5129032258064516, "acc_stderr": 0.028434533152681855, "acc_norm": 0.5129032258064516, "acc_norm_stderr": 0.028434533152681855 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.28078817733990147, "acc_stderr": 0.0316185633535861, "acc_norm": 0.28078817733990147, "acc_norm_stderr": 0.0316185633535861 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5757575757575758, "acc_stderr": 0.038592681420702636, "acc_norm": 0.5757575757575758, "acc_norm_stderr": 0.038592681420702636 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5151515151515151, "acc_stderr": 0.03560716516531061, "acc_norm": 0.5151515151515151, "acc_norm_stderr": 0.03560716516531061 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6580310880829016, "acc_stderr": 0.03423465100104283, "acc_norm": 0.6580310880829016, "acc_norm_stderr": 0.03423465100104283 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.43846153846153846, "acc_stderr": 0.025158266016868575, "acc_norm": 0.43846153846153846, "acc_norm_stderr": 0.025158266016868575 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.027840811495871927, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.027840811495871927 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.44537815126050423, "acc_stderr": 0.0322841062671639, "acc_norm": 0.44537815126050423, 
"acc_norm_stderr": 0.0322841062671639 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5871559633027523, "acc_stderr": 0.021109128133413913, "acc_norm": 0.5871559633027523, "acc_norm_stderr": 0.021109128133413913 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3472222222222222, "acc_stderr": 0.032468872436376486, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.032468872436376486 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5294117647058824, "acc_stderr": 0.03503235296367992, "acc_norm": 0.5294117647058824, "acc_norm_stderr": 0.03503235296367992 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6244725738396625, "acc_stderr": 0.03152256243091156, "acc_norm": 0.6244725738396625, "acc_norm_stderr": 0.03152256243091156 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5291479820627802, "acc_stderr": 0.03350073248773404, "acc_norm": 0.5291479820627802, "acc_norm_stderr": 0.03350073248773404 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5343511450381679, "acc_stderr": 0.043749285605997376, "acc_norm": 0.5343511450381679, "acc_norm_stderr": 0.043749285605997376 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6528925619834711, "acc_stderr": 0.04345724570292534, "acc_norm": 0.6528925619834711, "acc_norm_stderr": 0.04345724570292534 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.49074074074074076, "acc_stderr": 0.04832853553437055, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.04832853553437055 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4294478527607362, "acc_stderr": 0.03889066619112722, "acc_norm": 0.4294478527607362, "acc_norm_stderr": 0.03889066619112722 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.045723723587374296, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.045723723587374296 }, "harness|hendrycksTest-management|5": { "acc": 0.5922330097087378, "acc_stderr": 0.0486577757041077, "acc_norm": 0.5922330097087378, "acc_norm_stderr": 0.0486577757041077 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6495726495726496, "acc_stderr": 0.0312561082442188, "acc_norm": 0.6495726495726496, "acc_norm_stderr": 0.0312561082442188 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6206896551724138, "acc_stderr": 0.01735126811754445, "acc_norm": 0.6206896551724138, "acc_norm_stderr": 0.01735126811754445 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5028901734104047, "acc_stderr": 0.02691864538323901, "acc_norm": 0.5028901734104047, "acc_norm_stderr": 0.02691864538323901 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2558659217877095, "acc_stderr": 0.014593620923210756, "acc_norm": 0.2558659217877095, "acc_norm_stderr": 0.014593620923210756 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.545751633986928, "acc_stderr": 0.028509807802626592, "acc_norm": 0.545751633986928, "acc_norm_stderr": 0.028509807802626592 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.49517684887459806, "acc_stderr": 0.028396770444111298, "acc_norm": 0.49517684887459806, "acc_norm_stderr": 0.028396770444111298 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5030864197530864, 
"acc_stderr": 0.027820214158594377, "acc_norm": 0.5030864197530864, "acc_norm_stderr": 0.027820214158594377 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3546099290780142, "acc_stderr": 0.028538650028878638, "acc_norm": 0.3546099290780142, "acc_norm_stderr": 0.028538650028878638 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3324641460234681, "acc_stderr": 0.01203202233226051, "acc_norm": 0.3324641460234681, "acc_norm_stderr": 0.01203202233226051 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5257352941176471, "acc_stderr": 0.03033257809455502, "acc_norm": 0.5257352941176471, "acc_norm_stderr": 0.03033257809455502 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.46895424836601307, "acc_stderr": 0.020188804456361887, "acc_norm": 0.46895424836601307, "acc_norm_stderr": 0.020188804456361887 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5636363636363636, "acc_stderr": 0.04750185058907296, "acc_norm": 0.5636363636363636, "acc_norm_stderr": 0.04750185058907296 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5387755102040817, "acc_stderr": 0.031912820526692774, "acc_norm": 0.5387755102040817, "acc_norm_stderr": 0.031912820526692774 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6318407960199005, "acc_stderr": 0.03410410565495302, "acc_norm": 0.6318407960199005, "acc_norm_stderr": 0.03410410565495302 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6549707602339181, "acc_stderr": 0.03645981377388806, "acc_norm": 0.6549707602339181, "acc_norm_stderr": 0.03645981377388806 }, "harness|truthfulqa:mc|0": { "mc1": 0.2582619339045288, "mc1_stderr": 0.0153218216884762, "mc2": 0.4053787386286284, "mc2_stderr": 0.013893490031868357 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_chaoyi-wu__MedLLaMA_13B
[ "region:us" ]
2023-08-18T10:37:59+00:00
{"pretty_name": "Evaluation run of chaoyi-wu/MedLLaMA_13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [chaoyi-wu/MedLLaMA_13B](https://huggingface.co/chaoyi-wu/MedLLaMA_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chaoyi-wu__MedLLaMA_13B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-24T13:04:01.266274](https://huggingface.co/datasets/open-llm-leaderboard/details_chaoyi-wu__MedLLaMA_13B/blob/main/results_2023-07-24T13%3A04%3A01.266274.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46685175478824187,\n \"acc_stderr\": 0.03531409019484935,\n \"acc_norm\": 0.47077526563025673,\n \"acc_norm_stderr\": 0.035299387024960424,\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.4053787386286284,\n \"mc2_stderr\": 0.013893490031868357\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285012,\n \"acc_norm\": 0.5426621160409556,\n \"acc_norm_stderr\": 0.014558106543924065\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5862378012348137,\n \"acc_stderr\": 0.004915003499517829,\n \"acc_norm\": 0.7853017327225652,\n \"acc_norm_stderr\": 0.004097736838432052\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n 
\"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.03765746693865151,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.03765746693865151\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237657,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237657\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101737,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.04043461861916747,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.04043461861916747\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184766,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184766\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5129032258064516,\n \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.5129032258064516,\n \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.038592681420702636,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.038592681420702636\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104283,\n \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104283\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.43846153846153846,\n \"acc_stderr\": 
0.025158266016868575,\n \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.025158266016868575\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5871559633027523,\n \"acc_stderr\": 0.021109128133413913,\n \"acc_norm\": 0.5871559633027523,\n \"acc_norm_stderr\": 0.021109128133413913\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367992,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367992\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.03889066619112722,\n \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.03889066619112722\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.0486577757041077,\n \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.0486577757041077\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n \"acc_stderr\": 0.0312561082442188,\n \"acc_norm\": 0.6495726495726496,\n \"acc_norm_stderr\": 0.0312561082442188\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.01735126811754445,\n \"acc_norm\": 0.6206896551724138,\n 
\"acc_norm_stderr\": 0.01735126811754445\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.02691864538323901,\n \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.02691864538323901\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n \"acc_stderr\": 0.014593620923210756,\n \"acc_norm\": 0.2558659217877095,\n \"acc_norm_stderr\": 0.014593620923210756\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626592,\n \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626592\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.49517684887459806,\n \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.027820214158594377,\n \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.027820214158594377\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878638,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3324641460234681,\n \"acc_stderr\": 0.01203202233226051,\n \"acc_norm\": 0.3324641460234681,\n \"acc_norm_stderr\": 0.01203202233226051\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455502,\n \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455502\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.46895424836601307,\n \"acc_stderr\": 0.020188804456361887,\n \"acc_norm\": 0.46895424836601307,\n \"acc_norm_stderr\": 0.020188804456361887\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n \"acc_stderr\": 0.03410410565495302,\n \"acc_norm\": 0.6318407960199005,\n \"acc_norm_stderr\": 0.03410410565495302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.03645981377388806,\n \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.03645981377388806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.4053787386286284,\n \"mc2_stderr\": 0.013893490031868357\n }\n}\n```", "repo_url": "https://huggingface.co/chaoyi-wu/MedLLaMA_13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_07_24T13_04_01.266274", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:04:01.266274.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:04:01.266274.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:04:01.266274.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:04:01.266274.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:04:01.266274.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T13_04_01.266274", "path": ["results_2023-07-24T13:04:01.266274.parquet"]}, {"split": "latest", "path": ["results_2023-07-24T13:04:01.266274.parquet"]}]}]}
2023-08-27T11:35:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of chaoyi-wu/MedLLaMA_13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model chaoyi-wu/MedLLaMA_13B on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-07-24T13:04:01.266274 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
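The flattened card text above drops the loading snippet that originally followed "To load the details from a run". A minimal sketch of what such a call looks like, assuming this record's details repo follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other records in this dump (the exact repo id is not stated here); the config and split names are taken from this record's own metadata:

```python
from datasets import load_dataset

# Repo id is an assumption based on the naming pattern of the other details
# datasets in this dump; "harness_arc_challenge_25" and the "latest" split
# appear in this record's metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_chaoyi-wu__MedLLaMA_13B",
    "harness_arc_challenge_25",
    split="latest",
)
print(data)
```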
[ "# Dataset Card for Evaluation run of chaoyi-wu/MedLLaMA_13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chaoyi-wu/MedLLaMA_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T13:04:01.266274 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of chaoyi-wu/MedLLaMA_13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chaoyi-wu/MedLLaMA_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T13:04:01.266274 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chaoyi-wu/MedLLaMA_13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model chaoyi-wu/MedLLaMA_13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-24T13:04:01.266274 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
07afbdde632f94e01546a5bdae7c93a51fd71bca
# Dataset Card for Evaluation run of lxe/Cerebras-GPT-2.7B-Alpaca-SP

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/lxe/Cerebras-GPT-2.7B-Alpaca-SP
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [lxe/Cerebras-GPT-2.7B-Alpaca-SP](https://huggingface.co/lxe/Cerebras-GPT-2.7B-Alpaca-SP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lxe__Cerebras-GPT-2.7B-Alpaca-SP",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T18:18:30.212665](https://huggingface.co/datasets/open-llm-leaderboard/details_lxe__Cerebras-GPT-2.7B-Alpaca-SP/blob/main/results_2023-09-17T18-18-30.212665.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001153523489932886,
        "em_stderr": 0.00034761798968570946,
        "f1": 0.047793624161073994,
        "f1_stderr": 0.0012393218832561278,
        "acc": 0.2796858853033169,
        "acc_stderr": 0.007985699601639381
    },
    "harness|drop|3": {
        "em": 0.001153523489932886,
        "em_stderr": 0.00034761798968570946,
        "f1": 0.047793624161073994,
        "f1_stderr": 0.0012393218832561278
    },
    "harness|gsm8k|5": {
        "acc": 0.00530705079605762,
        "acc_stderr": 0.0020013057209480587
    },
    "harness|winogrande|5": {
        "acc": 0.5540647198105761,
        "acc_stderr": 0.013970093482330704
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_lxe__Cerebras-GPT-2.7B-Alpaca-SP
[ "region:us" ]
2023-08-18T10:38:12+00:00
{"pretty_name": "Evaluation run of lxe/Cerebras-GPT-2.7B-Alpaca-SP", "dataset_summary": "Dataset automatically created during the evaluation run of model [lxe/Cerebras-GPT-2.7B-Alpaca-SP](https://huggingface.co/lxe/Cerebras-GPT-2.7B-Alpaca-SP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lxe__Cerebras-GPT-2.7B-Alpaca-SP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T18:18:30.212665](https://huggingface.co/datasets/open-llm-leaderboard/details_lxe__Cerebras-GPT-2.7B-Alpaca-SP/blob/main/results_2023-09-17T18-18-30.212665.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968570946,\n \"f1\": 0.047793624161073994,\n \"f1_stderr\": 0.0012393218832561278,\n \"acc\": 0.2796858853033169,\n \"acc_stderr\": 0.007985699601639381\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968570946,\n \"f1\": 0.047793624161073994,\n \"f1_stderr\": 0.0012393218832561278\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.0020013057209480587\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5540647198105761,\n \"acc_stderr\": 0.013970093482330704\n }\n}\n```", "repo_url": "https://huggingface.co/lxe/Cerebras-GPT-2.7B-Alpaca-SP", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T18_18_30.212665", "path": ["**/details_harness|drop|3_2023-09-17T18-18-30.212665.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T18-18-30.212665.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T18_18_30.212665", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-18-30.212665.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-18-30.212665.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:29:28.881974.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:29:28.881974.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:29:28.881974.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:29:28.881974.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:29:28.881974.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T18_18_30.212665", "path": ["**/details_harness|winogrande|5_2023-09-17T18-18-30.212665.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T18-18-30.212665.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_29_28.881974", "path": ["results_2023-07-18T11:29:28.881974.parquet"]}, {"split": "2023_09_17T18_18_30.212665", "path": ["results_2023-09-17T18-18-30.212665.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T18-18-30.212665.parquet"]}]}]}
2023-09-17T17:18:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of lxe/Cerebras-GPT-2.7B-Alpaca-SP ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model lxe/Cerebras-GPT-2.7B-Alpaca-SP on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading snippet after this card text): ## Latest results These are the latest results from run 2023-09-17T18:18:30.212665 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
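The loading snippet that the summary above refers to was stripped when this text was flattened. A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming pattern (the repository id below is inferred from that pattern, not stated in this record), is:

```python
from datasets import load_dataset

# Repository id is an assumption inferred from the leaderboard naming pattern
# details_<org>__<model>; the config name is declared in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_lxe__Cerebras-GPT-2.7B-Alpaca-SP",
    "harness_winogrande_5",
    split="train",
)
```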
[ "# Dataset Card for Evaluation run of lxe/Cerebras-GPT-2.7B-Alpaca-SP", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lxe/Cerebras-GPT-2.7B-Alpaca-SP on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T18:18:30.212665(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of lxe/Cerebras-GPT-2.7B-Alpaca-SP", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lxe/Cerebras-GPT-2.7B-Alpaca-SP on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T18:18:30.212665(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 68, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lxe/Cerebras-GPT-2.7B-Alpaca-SP## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lxe/Cerebras-GPT-2.7B-Alpaca-SP on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T18:18:30.212665(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ceda35e21afb4ab3517ae3eb689044ed781becc3
# Dataset Card for Evaluation run of anhnv125/pygmalion-6b-roleplay

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/anhnv125/pygmalion-6b-roleplay
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [anhnv125/pygmalion-6b-roleplay](https://huggingface.co/anhnv125/pygmalion-6b-roleplay) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T05:57:52.375499](https://huggingface.co/datasets/open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay/blob/main/results_2023-09-17T05-57-52.375499.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054709947,
        "f1": 0.05561765939597344,
        "f1_stderr": 0.0013547337231371388,
        "acc": 0.3190247209594698,
        "acc_stderr": 0.008257334480912119
    },
    "harness|drop|3": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054709947,
        "f1": 0.05561765939597344,
        "f1_stderr": 0.0013547337231371388
    },
    "harness|gsm8k|5": {
        "acc": 0.011372251705837756,
        "acc_stderr": 0.002920666198788722
    },
    "harness|winogrande|5": {
        "acc": 0.6266771902131019,
        "acc_stderr": 0.013594002763035516
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
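Besides the per-task detail configs, the record's metadata below also declares a "results" configuration whose "latest" split points at the most recent aggregated run. A minimal sketch for reading those aggregated metrics (the config and split names come from the metadata; the exact row layout of the aggregated parquet is an assumption) is:

```python
from datasets import load_dataset

# The "results" config and its "latest" split are declared in this record's
# metadata; assuming the aggregated parquet holds the metrics as row(s).
results = load_dataset(
    "open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics such as acc, em, f1 for the latest run
```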
open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay
[ "region:us" ]
2023-08-18T10:38:24+00:00
{"pretty_name": "Evaluation run of anhnv125/pygmalion-6b-roleplay", "dataset_summary": "Dataset automatically created during the evaluation run of model [anhnv125/pygmalion-6b-roleplay](https://huggingface.co/anhnv125/pygmalion-6b-roleplay) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T05:57:52.375499](https://huggingface.co/datasets/open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay/blob/main/results_2023-09-17T05-57-52.375499.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054709947,\n \"f1\": 0.05561765939597344,\n \"f1_stderr\": 0.0013547337231371388,\n \"acc\": 0.3190247209594698,\n \"acc_stderr\": 0.008257334480912119\n },\n \"harness|drop|3\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054709947,\n \"f1\": 0.05561765939597344,\n \"f1_stderr\": 0.0013547337231371388\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.002920666198788722\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6266771902131019,\n \"acc_stderr\": 0.013594002763035516\n }\n}\n```", "repo_url": "https://huggingface.co/anhnv125/pygmalion-6b-roleplay", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T05_57_52.375499", "path": ["**/details_harness|drop|3_2023-09-17T05-57-52.375499.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T05-57-52.375499.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T05_57_52.375499", "path": ["**/details_harness|gsm8k|5_2023-09-17T05-57-52.375499.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T05-57-52.375499.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:17:43.702617.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:17:43.702617.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:17:43.702617.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:17:43.702617.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:17:43.702617.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T05_57_52.375499", "path": ["**/details_harness|winogrande|5_2023-09-17T05-57-52.375499.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T05-57-52.375499.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T14_17_43.702617", "path": ["results_2023-08-01T14:17:43.702617.parquet"]}, {"split": "2023_09_17T05_57_52.375499", "path": ["results_2023-09-17T05-57-52.375499.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T05-57-52.375499.parquet"]}]}]}
2023-09-17T04:58:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of anhnv125/pygmalion-6b-roleplay ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model anhnv125/pygmalion-6b-roleplay on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading snippet after this card text): ## Latest results These are the latest results from run 2023-09-17T05:57:52.375499 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
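The loading snippet that the summary above refers to was stripped when this text was flattened; it is given in full in the dataset card and metadata for this record:

```python
from datasets import load_dataset

# Per-example details for one evaluation task of this run, as given in the
# full dataset card for open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay.
data = load_dataset(
    "open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay",
    "harness_winogrande_5",
    split="train",
)
```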
[ "# Dataset Card for Evaluation run of anhnv125/pygmalion-6b-roleplay", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model anhnv125/pygmalion-6b-roleplay on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T05:57:52.375499(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of anhnv125/pygmalion-6b-roleplay", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model anhnv125/pygmalion-6b-roleplay on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T05:57:52.375499(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of anhnv125/pygmalion-6b-roleplay## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model anhnv125/pygmalion-6b-roleplay on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T05:57:52.375499(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4dfddd3bfd2478e79a6418c45be6f78ce1f54359
# Dataset Card for Evaluation run of GigaML/X1-large ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/GigaML/X1-large - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [GigaML/X1-large](https://huggingface.co/GigaML/X1-large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_GigaML__X1-large", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-17T02:33:44.123886](https://huggingface.co/datasets/open-llm-leaderboard/details_GigaML__X1-large/blob/main/results_2023-08-17T02%3A33%3A44.123886.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2314240573187148, "acc_stderr": 0.03071122006512167, "acc_norm": 0.2314240573187148, "acc_norm_stderr": 0.03071122006512167, "mc1": 1.0, "mc1_stderr": 0.0, "mc2": NaN, "mc2_stderr": NaN }, "harness|arc:challenge|25": { "acc": 0.22696245733788395, "acc_stderr": 0.012240491536132861, "acc_norm": 0.22696245733788395, "acc_norm_stderr": 0.012240491536132861 }, "harness|hellaswag|10": { "acc": 0.2504481179047998, "acc_stderr": 0.004323856300539177, "acc_norm": 0.2504481179047998, "acc_norm_stderr": 0.004323856300539177 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, 
"acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, 
"acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 1.0, "mc1_stderr": 0.0, "mc2": NaN, "mc2_stderr": NaN } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_GigaML__X1-large
[ "region:us" ]
2023-08-18T10:39:13+00:00
{"pretty_name": "Evaluation run of GigaML/X1-large", "dataset_summary": "Dataset automatically created during the evaluation run of model [GigaML/X1-large](https://huggingface.co/GigaML/X1-large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GigaML__X1-large\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-17T02:33:44.123886](https://huggingface.co/datasets/open-llm-leaderboard/details_GigaML__X1-large/blob/main/results_2023-08-17T02%3A33%3A44.123886.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2314240573187148,\n \"acc_stderr\": 0.03071122006512167,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n 
\"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n 
\"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 
0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n }\n}\n```", "repo_url": "https://huggingface.co/GigaML/X1-large", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|arc:challenge|25_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": 
["**/details_harness|arc:challenge|25_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hellaswag|10_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T02:33:44.123886.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T02:33:44.123886.parquet", 
"**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T02:33:44.123886.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T02:33:44.123886.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": 
"2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T02:33:44.123886.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T02_33_44.123886", "path": ["results_2023-08-17T02:33:44.123886.parquet"]}, {"split": "latest", "path": ["results_2023-08-17T02:33:44.123886.parquet"]}]}]}
2023-08-27T11:35:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of GigaML/X1-large ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model GigaML/X1-large on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-17T02:33:44.123886 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
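The flattened summary above drops the loading snippet that normally follows "do the following:". A minimal sketch of such a call is given below; it assumes only the `datasets` library, and since the repository id is not stated in this record it is inferred from the `details_<org>__<model>` naming convention used by the other records in this dump, so treat it as an assumption.

```python
from datasets import load_dataset

# Sketch only: the repository id is an assumption inferred from the
# leaderboard's details_<org>__<model> naming convention, not stated in this record.
data = load_dataset(
    "open-llm-leaderboard/details_GigaML__X1-large",
    "results",       # aggregated metrics; per-task configs are listed in the record's metadata
    split="latest",  # "latest" always mirrors the most recent run
)
print(data)
```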
[ "# Dataset Card for Evaluation run of GigaML/X1-large", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GigaML/X1-large on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T02:33:44.123886 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of GigaML/X1-large", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GigaML/X1-large on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T02:33:44.123886 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of GigaML/X1-large## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model GigaML/X1-large on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-17T02:33:44.123886 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d319ecd1bd327e12aca84b8aef5eac8e8a4427fc
# Dataset Card for Evaluation run of Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload](https://huggingface.co/Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__WizardVicuna-Uncensored-3B-instruct-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T08:35:07.670850](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__WizardVicuna-Uncensored-3B-instruct-PL-lora_unload/blob/main/results_2023-09-23T08-35-07.670850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.0002964962989801231,
        "f1": 0.04817009228187932,
        "f1_stderr": 0.0012423575338082733,
        "acc": 0.3277999943752083,
        "acc_stderr": 0.007841759236399801
    },
    "harness|drop|3": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.0002964962989801231,
        "f1": 0.04817009228187932,
        "f1_stderr": 0.0012423575338082733
    },
    "harness|gsm8k|5": {
        "acc": 0.006823351023502654,
        "acc_stderr": 0.0022675371022544753
    },
    "harness|winogrande|5": {
        "acc": 0.648776637726914,
        "acc_stderr": 0.013415981370545126
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
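A minimal sketch, assuming only the `datasets` library, of how the "results" configuration and the timestamped splits described above could be loaded; the configuration and split names used here are the ones listed in this record's metadata.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Aspik101__WizardVicuna-Uncensored-3B-instruct-PL-lora_unload"

# Aggregated metrics for the most recent run; "latest" always mirrors it.
latest_results = load_dataset(repo, "results", split="latest")

# Earlier runs stay available under their timestamped splits.
first_run_results = load_dataset(repo, "results", split="2023_08_09T14_19_17.231862")
```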
open-llm-leaderboard/details_Aspik101__WizardVicuna-Uncensored-3B-instruct-PL-lora_unload
[ "region:us" ]
2023-08-18T10:39:23+00:00
{"pretty_name": "Evaluation run of Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload](https://huggingface.co/Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__WizardVicuna-Uncensored-3B-instruct-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T08:35:07.670850](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__WizardVicuna-Uncensored-3B-instruct-PL-lora_unload/blob/main/results_2023-09-23T08-35-07.670850.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801231,\n \"f1\": 0.04817009228187932,\n \"f1_stderr\": 0.0012423575338082733,\n \"acc\": 0.3277999943752083,\n \"acc_stderr\": 0.007841759236399801\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801231,\n \"f1\": 0.04817009228187932,\n \"f1_stderr\": 0.0012423575338082733\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022544753\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.648776637726914,\n \"acc_stderr\": 0.013415981370545126\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T08_35_07.670850", "path": ["**/details_harness|drop|3_2023-09-23T08-35-07.670850.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T08-35-07.670850.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T08_35_07.670850", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-35-07.670850.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-35-07.670850.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": 
["**/details_harness|hellaswag|10_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:17.231862.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:17.231862.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:17.231862.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:17.231862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:19:17.231862.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:19:17.231862.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:19:17.231862.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T08_35_07.670850", "path": ["**/details_harness|winogrande|5_2023-09-23T08-35-07.670850.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T08-35-07.670850.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T14_19_17.231862", "path": ["results_2023-08-09T14:19:17.231862.parquet"]}, {"split": "2023_09_23T08_35_07.670850", "path": ["results_2023-09-23T08-35-07.670850.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T08-35-07.670850.parquet"]}]}]}
2023-09-23T07:35:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T08:35:07.670850 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T08:35:07.670850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T08:35:07.670850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 36, 31, 184, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/WizardVicuna-Uncensored-3B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T08:35:07.670850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
073b9f1c7c97e4bf1f23b1176d010f30eeecd088
# Dataset Card for Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/llama-30b-2048-instruct-PL-lora_unload](https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T16:55:26.750337](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload/blob/main/results_2023-09-23T16-55-26.750337.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.006396812080536913,
        "em_stderr": 0.0008164468837432337,
        "f1": 0.09082529362416124,
        "f1_stderr": 0.00181131297042163,
        "acc": 0.48843566764183,
        "acc_stderr": 0.010921337573474368
    },
    "harness|drop|3": {
        "em": 0.006396812080536913,
        "em_stderr": 0.0008164468837432337,
        "f1": 0.09082529362416124,
        "f1_stderr": 0.00181131297042163
    },
    "harness|gsm8k|5": {
        "acc": 0.17892342683851403,
        "acc_stderr": 0.010557661392901289
    },
    "harness|winogrande|5": {
        "acc": 0.797947908445146,
        "acc_stderr": 0.011285013754047448
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
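Beyond the single `harness_winogrande_5` example above, the card also mentions the aggregated "results" configuration and the "latest" split alias. The snippet below is a small illustrative sketch, not part of the original card: it assumes the `results` configuration and `latest` split declared in this record's metadata load like any other configuration via `datasets.load_dataset`.

```python
from datasets import load_dataset

# Sketch: load the aggregated metrics for this model.
# "results" and "latest" are the configuration/split names declared in the
# dataset metadata; this assumes they load like any other configuration.
results = load_dataset(
    "open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload",
    "results",
    split="latest",
)

# One row per aggregated record; print the first to inspect its fields.
print(results[0])
```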
open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload
[ "region:us" ]
2023-08-18T10:39:50+00:00
{"pretty_name": "Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/llama-30b-2048-instruct-PL-lora_unload](https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T16:55:26.750337](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload/blob/main/results_2023-09-23T16-55-26.750337.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432337,\n \"f1\": 0.09082529362416124,\n \"f1_stderr\": 0.00181131297042163,\n \"acc\": 0.48843566764183,\n \"acc_stderr\": 0.010921337573474368\n },\n \"harness|drop|3\": {\n \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432337,\n \"f1\": 0.09082529362416124,\n \"f1_stderr\": 0.00181131297042163\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17892342683851403,\n \"acc_stderr\": 0.010557661392901289\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047448\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T16_55_26.750337", "path": ["**/details_harness|drop|3_2023-09-23T16-55-26.750337.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T16-55-26.750337.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T16_55_26.750337", "path": ["**/details_harness|gsm8k|5_2023-09-23T16-55-26.750337.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T16-55-26.750337.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:59:52.848491.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:59:52.848491.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T16_55_26.750337", "path": ["**/details_harness|winogrande|5_2023-09-23T16-55-26.750337.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T16-55-26.750337.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T15_59_52.848491", "path": ["results_2023-08-09T15:59:52.848491.parquet"]}, {"split": "2023_09_23T16_55_26.750337", "path": ["results_2023-09-23T16-55-26.750337.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T16-55-26.750337.parquet"]}]}]}
2023-09-23T15:55:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/llama-30b-2048-instruct-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T16:55:26.750337 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/llama-30b-2048-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T16:55:26.750337(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/llama-30b-2048-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T16:55:26.750337(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/llama-30b-2048-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T16:55:26.750337(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b1e31ec88eff17f4128ff7645611dedf40ead82e
# Dataset Card for Evaluation run of Aspik101/StableBeluga-13B-instruct-PL-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/StableBeluga-13B-instruct-PL-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/StableBeluga-13B-instruct-PL-lora_unload](https://huggingface.co/Aspik101/StableBeluga-13B-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__StableBeluga-13B-instruct-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T13:27:06.134462](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__StableBeluga-13B-instruct-PL-lora_unload/blob/main/results_2023-10-13T13-27-06.134462.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.014786073825503355,
        "em_stderr": 0.0012360366760472946,
        "f1": 0.08531564597315425,
        "f1_stderr": 0.001909056545524939,
        "acc": 0.44382508573319457,
        "acc_stderr": 0.010461592536002241
    },
    "harness|drop|3": {
        "em": 0.014786073825503355,
        "em_stderr": 0.0012360366760472946,
        "f1": 0.08531564597315425,
        "f1_stderr": 0.001909056545524939
    },
    "harness|gsm8k|5": {
        "acc": 0.12206216830932524,
        "acc_stderr": 0.009017054965766495
    },
    "harness|winogrande|5": {
        "acc": 0.7655880031570639,
        "acc_stderr": 0.011906130106237986
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
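As with the previous card, per-task details can be loaded configuration by configuration. The short sketch below is an illustration rather than part of the original card: it uses the `harness_drop_3` configuration named in this record's metadata to inspect the per-example DROP details from the most recent run, assuming the `latest` split alias behaves as described above.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Aspik101__StableBeluga-13B-instruct-PL-lora_unload"

# Per-example DROP details from the most recent run ("latest" split alias).
drop_details = load_dataset(repo, "harness_drop_3", split="latest")

# Inspect the stored fields and one record.
print(drop_details.column_names)
print(drop_details[0])
```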
open-llm-leaderboard/details_Aspik101__StableBeluga-13B-instruct-PL-lora_unload
[ "region:us" ]
2023-08-18T10:40:01+00:00
{"pretty_name": "Evaluation run of Aspik101/StableBeluga-13B-instruct-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/StableBeluga-13B-instruct-PL-lora_unload](https://huggingface.co/Aspik101/StableBeluga-13B-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__StableBeluga-13B-instruct-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T13:27:06.134462](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__StableBeluga-13B-instruct-PL-lora_unload/blob/main/results_2023-10-13T13-27-06.134462.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760472946,\n \"f1\": 0.08531564597315425,\n \"f1_stderr\": 0.001909056545524939,\n \"acc\": 0.44382508573319457,\n \"acc_stderr\": 0.010461592536002241\n },\n \"harness|drop|3\": {\n \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760472946,\n \"f1\": 0.08531564597315425,\n \"f1_stderr\": 0.001909056545524939\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12206216830932524,\n \"acc_stderr\": 0.009017054965766495\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/StableBeluga-13B-instruct-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T13_27_06.134462", "path": ["**/details_harness|drop|3_2023-10-13T13-27-06.134462.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T13-27-06.134462.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T13_27_06.134462", "path": ["**/details_harness|gsm8k|5_2023-10-13T13-27-06.134462.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T13-27-06.134462.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hellaswag|10_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:43:44.316126.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:43:44.316126.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:43:44.316126.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:43:44.316126.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:43:44.316126.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:43:44.316126.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T13_27_06.134462", "path": ["**/details_harness|winogrande|5_2023-10-13T13-27-06.134462.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T13-27-06.134462.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_43_44.316126", "path": ["results_2023-08-09T11:43:44.316126.parquet"]}, {"split": "2023_10_13T13_27_06.134462", "path": ["results_2023-10-13T13-27-06.134462.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T13-27-06.134462.parquet"]}]}]}
2023-10-13T12:27:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/StableBeluga-13B-instruct-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/StableBeluga-13B-instruct-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T13:27:06.134462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
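The plain-text rendering above drops the code block that originally followed "you can for instance do the following:". For reference, the snippet as given in the dataset_summary metadata earlier in this record is:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__StableBeluga-13B-instruct-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```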
[ "# Dataset Card for Evaluation run of Aspik101/StableBeluga-13B-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/StableBeluga-13B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T13:27:06.134462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/StableBeluga-13B-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/StableBeluga-13B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T13:27:06.134462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/StableBeluga-13B-instruct-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/StableBeluga-13B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T13:27:06.134462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f5085b87a9a2e1449b0129f6e02900027be61abb
# Dataset Card for Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload](https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T06:39:08.245014](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload/blob/main/results_2023-09-17T06-39-08.245014.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.004404362416107382,
        "em_stderr": 0.0006781451620479503,
        "f1": 0.07459731543624175,
        "f1_stderr": 0.001589740038419953,
        "acc": 0.46844372186482186,
        "acc_stderr": 0.010745171507100412
    },
    "harness|drop|3": {
        "em": 0.004404362416107382,
        "em_stderr": 0.0006781451620479503,
        "f1": 0.07459731543624175,
        "f1_stderr": 0.001589740038419953
    },
    "harness|gsm8k|5": {
        "acc": 0.15314632297194844,
        "acc_stderr": 0.009919728152791473
    },
    "harness|winogrande|5": {
        "acc": 0.7837411207576953,
        "acc_stderr": 0.011570614861409348
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
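Beyond the per-task configurations, the aggregated per-run metrics live in the "results" configuration described in the summary above. A hedged sketch of loading them (assuming the same config layout as the metadata blocks in this collection, which expose a "latest" split for "results"):

```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration for this model.
# The "latest" split name is assumed from the config metadata used
# throughout this collection of evaluation-run datasets.
results = load_dataset(
    "open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload",
    "results",
    split="latest",
)
print(results[0])  # em/f1/acc aggregates for the most recent run
```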
open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload
[ "region:us" ]
2023-08-18T10:40:15+00:00
{"pretty_name": "Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload](https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T06:39:08.245014](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload/blob/main/results_2023-09-17T06-39-08.245014.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479503,\n \"f1\": 0.07459731543624175,\n \"f1_stderr\": 0.001589740038419953,\n \"acc\": 0.46844372186482186,\n \"acc_stderr\": 0.010745171507100412\n },\n \"harness|drop|3\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479503,\n \"f1\": 0.07459731543624175,\n \"f1_stderr\": 0.001589740038419953\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15314632297194844,\n \"acc_stderr\": 0.009919728152791473\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409348\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T06_39_08.245014", "path": ["**/details_harness|drop|3_2023-09-17T06-39-08.245014.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T06-39-08.245014.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T06_39_08.245014", "path": ["**/details_harness|gsm8k|5_2023-09-17T06-39-08.245014.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T06-39-08.245014.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": 
["**/details_harness|hellaswag|10_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:03:01.897575.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:03:01.897575.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T06_39_08.245014", "path": ["**/details_harness|winogrande|5_2023-09-17T06-39-08.245014.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T06-39-08.245014.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T14_03_01.897575", "path": ["results_2023-08-09T14:03:01.897575.parquet"]}, {"split": "2023_09_17T06_39_08.245014", "path": ["results_2023-09-17T06-39-08.245014.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T06-39-08.245014.parquet"]}]}]}
2023-09-17T05:39:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T06:39:08.245014 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T06:39:08.245014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T06:39:08.245014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 34, 31, 182, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T06:39:08.245014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f02040f7a2e12229ada3f522d01f7a9b96e7cb04
# Dataset Card for Evaluation run of Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload](https://huggingface.co/Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Aspik101__Llama-2-7b-hf-instruct-pl-lora_unload", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T16:09:05.436886](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Llama-2-7b-hf-instruct-pl-lora_unload/blob/main/results_2023-10-15T16-09-05.436886.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0012583892617449664, "em_stderr": 0.0003630560893119025, "f1": 0.055425755033557136, "f1_stderr": 0.0012906670139037101, "acc": 0.4008552675276587, "acc_stderr": 0.00949293465826499 }, "harness|drop|3": { "em": 0.0012583892617449664, "em_stderr": 0.0003630560893119025, "f1": 0.055425755033557136, "f1_stderr": 0.0012906670139037101 }, "harness|gsm8k|5": { "acc": 0.0621683093252464, "acc_stderr": 0.00665103564453169 }, "harness|winogrande|5": { "acc": 0.739542225730071, "acc_stderr": 0.012334833671998289 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Aspik101__Llama-2-7b-hf-instruct-pl-lora_unload
[ "region:us" ]
2023-08-18T10:41:01+00:00
{"pretty_name": "Evaluation run of Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload](https://huggingface.co/Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__Llama-2-7b-hf-instruct-pl-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T16:09:05.436886](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Llama-2-7b-hf-instruct-pl-lora_unload/blob/main/results_2023-10-15T16-09-05.436886.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119025,\n \"f1\": 0.055425755033557136,\n \"f1_stderr\": 0.0012906670139037101,\n \"acc\": 0.4008552675276587,\n \"acc_stderr\": 0.00949293465826499\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119025,\n \"f1\": 0.055425755033557136,\n \"f1_stderr\": 0.0012906670139037101\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0621683093252464,\n \"acc_stderr\": 0.00665103564453169\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998289\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T16_09_05.436886", "path": ["**/details_harness|drop|3_2023-10-15T16-09-05.436886.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T16-09-05.436886.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T16_09_05.436886", "path": ["**/details_harness|gsm8k|5_2023-10-15T16-09-05.436886.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T16-09-05.436886.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:00:24.420130.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:00:24.420130.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:00:24.420130.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:00:24.420130.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:00:24.420130.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:00:24.420130.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T16_09_05.436886", "path": ["**/details_harness|winogrande|5_2023-10-15T16-09-05.436886.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T16-09-05.436886.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T10_00_24.420130", "path": ["results_2023-07-25T10:00:24.420130.parquet"]}, {"split": "2023_10_15T16_09_05.436886", "path": ["results_2023-10-15T16-09-05.436886.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T16-09-05.436886.parquet"]}]}]}
2023-10-15T15:09:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2023-10-15T16:09:05.436886 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
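The loading snippet was dropped from this flattened copy of the card. A minimal sketch of that call, assuming the repository id follows the `details_<org>__<model>` naming pattern used by the other open-llm-leaderboard records in this dump (verify the actual id before use):

```python
from datasets import load_dataset

# Assumed repository id for this record, following the usual
# open-llm-leaderboard "details_<org>__<model>" pattern.
data = load_dataset(
    "open-llm-leaderboard/details_Aspik101__Llama-2-7b-hf-instruct-pl-lora_unload",
    "harness_winogrande_5",  # one of the 64 per-task configurations
    split="train",           # per the card, "train" tracks the latest run
)
```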
[ "# Dataset Card for Evaluation run of Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T16:09:05.436886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T16:09:05.436886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 32, 31, 180, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Llama-2-7b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T16:09:05.436886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6618d012287954f4f3dc2d28c2361a21f4a5c089
# Dataset Card for Evaluation run of Aspik101/vicuna-13b-v1.5-PL-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/vicuna-13b-v1.5-PL-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/vicuna-13b-v1.5-PL-lora_unload](https://huggingface.co/Aspik101/vicuna-13b-v1.5-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__vicuna-13b-v1.5-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T07:40:14.058498](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__vicuna-13b-v1.5-PL-lora_unload/blob/main/results_2023-10-18T07-40-14.058498.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.005453020134228188,
        "em_stderr": 0.0007541727796792593,
        "f1": 0.071821518456376,
        "f1_stderr": 0.001594785727102056,
        "acc": 0.4374953999376486,
        "acc_stderr": 0.010587747311370011
    },
    "harness|drop|3": {
        "em": 0.005453020134228188,
        "em_stderr": 0.0007541727796792593,
        "f1": 0.071821518456376,
        "f1_stderr": 0.001594785727102056
    },
    "harness|gsm8k|5": {
        "acc": 0.12282031842304776,
        "acc_stderr": 0.009041108602874676
    },
    "harness|winogrande|5": {
        "acc": 0.7521704814522494,
        "acc_stderr": 0.012134386019865348
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
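The aggregated numbers quoted in the card above come from the "results" configuration. A minimal sketch of pulling them directly, assuming the aggregated parquet files load like any other configuration of this repository (config and split names are taken from the metadata listed below):

```python
from datasets import load_dataset

# "results" is the extra configuration holding the aggregated metrics;
# the "latest" split points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Aspik101__vicuna-13b-v1.5-PL-lora_unload",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated scores for the latest run
```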
open-llm-leaderboard/details_Aspik101__vicuna-13b-v1.5-PL-lora_unload
[ "region:us" ]
2023-08-18T10:41:12+00:00
{"pretty_name": "Evaluation run of Aspik101/vicuna-13b-v1.5-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/vicuna-13b-v1.5-PL-lora_unload](https://huggingface.co/Aspik101/vicuna-13b-v1.5-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__vicuna-13b-v1.5-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T07:40:14.058498](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__vicuna-13b-v1.5-PL-lora_unload/blob/main/results_2023-10-18T07-40-14.058498.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005453020134228188,\n \"em_stderr\": 0.0007541727796792593,\n \"f1\": 0.071821518456376,\n \"f1_stderr\": 0.001594785727102056,\n \"acc\": 0.4374953999376486,\n \"acc_stderr\": 0.010587747311370011\n },\n \"harness|drop|3\": {\n \"em\": 0.005453020134228188,\n \"em_stderr\": 0.0007541727796792593,\n \"f1\": 0.071821518456376,\n \"f1_stderr\": 0.001594785727102056\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \"acc_stderr\": 0.009041108602874676\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865348\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/vicuna-13b-v1.5-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T07_40_14.058498", "path": ["**/details_harness|drop|3_2023-10-18T07-40-14.058498.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T07-40-14.058498.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T07_40_14.058498", "path": ["**/details_harness|gsm8k|5_2023-10-18T07-40-14.058498.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T07-40-14.058498.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:01:43.778910.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:01:43.778910.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:01:43.778910.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:01:43.778910.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:01:43.778910.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:01:43.778910.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T07_40_14.058498", "path": ["**/details_harness|winogrande|5_2023-10-18T07-40-14.058498.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T07-40-14.058498.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T10_01_43.778910", "path": ["results_2023-08-09T10:01:43.778910.parquet"]}, {"split": "2023_10_18T07_40_14.058498", "path": ["results_2023-10-18T07-40-14.058498.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T07-40-14.058498.parquet"]}]}]}
2023-10-18T06:40:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/vicuna-13b-v1.5-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/vicuna-13b-v1.5-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2023-10-18T07:40:14.058498 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
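The loading snippet was dropped from this flattened copy of the card. For reference, it is the same call given in the full card for this model above:

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 64 per-task configurations;
# per the card, the "train" split tracks the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_Aspik101__vicuna-13b-v1.5-PL-lora_unload",
    "harness_winogrande_5",
    split="train",
)
```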
[ "# Dataset Card for Evaluation run of Aspik101/vicuna-13b-v1.5-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/vicuna-13b-v1.5-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T07:40:14.058498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/vicuna-13b-v1.5-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/vicuna-13b-v1.5-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T07:40:14.058498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/vicuna-13b-v1.5-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/vicuna-13b-v1.5-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T07:40:14.058498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
275aa24f02a98809547e4b82beed605b5ecb6563
# Dataset Card for Evaluation run of Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload](https://huggingface.co/Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__Redmond-Puffin-13B-instruct-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T13:30:13.279131](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Redmond-Puffin-13B-instruct-PL-lora_unload/blob/main/results_2023-10-17T13-30-13.279131.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054710113,
        "f1": 0.05736577181208053,
        "f1_stderr": 0.0013664047590611983,
        "acc": 0.43379799697577687,
        "acc_stderr": 0.010348919090911759
    },
    "harness|drop|3": {
        "em": 0.002936241610738255,
        "em_stderr": 0.0005541113054710113,
        "f1": 0.05736577181208053,
        "f1_stderr": 0.0013664047590611983
    },
    "harness|gsm8k|5": {
        "acc": 0.1106899166034875,
        "acc_stderr": 0.008642172551392473
    },
    "harness|winogrande|5": {
        "acc": 0.7569060773480663,
        "acc_stderr": 0.012055665630431044
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
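Since the card above mentions 64 per-task configurations, a small sketch of one way to enumerate them with the standard `datasets` helper; the exact names are defined by the config list in the metadata that follows:

```python
from datasets import get_dataset_config_names

# List every configuration of this details repository: the per-task
# "harness_*" configs plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Aspik101__Redmond-Puffin-13B-instruct-PL-lora_unload"
)
print(len(configs))
print(configs[:5])
```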
open-llm-leaderboard/details_Aspik101__Redmond-Puffin-13B-instruct-PL-lora_unload
[ "region:us" ]
2023-08-18T10:41:36+00:00
{"pretty_name": "Evaluation run of Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload](https://huggingface.co/Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__Redmond-Puffin-13B-instruct-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T13:30:13.279131](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Redmond-Puffin-13B-instruct-PL-lora_unload/blob/main/results_2023-10-17T13-30-13.279131.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710113,\n \"f1\": 0.05736577181208053,\n \"f1_stderr\": 0.0013664047590611983,\n \"acc\": 0.43379799697577687,\n \"acc_stderr\": 0.010348919090911759\n },\n \"harness|drop|3\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710113,\n \"f1\": 0.05736577181208053,\n \"f1_stderr\": 0.0013664047590611983\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1106899166034875,\n \"acc_stderr\": 0.008642172551392473\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431044\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T13_30_13.279131", "path": ["**/details_harness|drop|3_2023-10-17T13-30-13.279131.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T13-30-13.279131.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T13_30_13.279131", "path": ["**/details_harness|gsm8k|5_2023-10-17T13-30-13.279131.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T13-30-13.279131.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:16:00.382833.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:16:00.382833.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:16:00.382833.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:16:00.382833.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:16:00.382833.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:16:00.382833.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:16:00.382833.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T13_30_13.279131", "path": ["**/details_harness|winogrande|5_2023-10-17T13-30-13.279131.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T13-30-13.279131.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_16_00.382833", "path": ["results_2023-08-09T11:16:00.382833.parquet"]}, {"split": "2023_10_17T13_30_13.279131", "path": ["results_2023-10-17T13-30-13.279131.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T13-30-13.279131.parquet"]}]}]}
2023-10-17T12:30:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T13:30:13.279131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T13:30:13.279131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T13:30:13.279131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Redmond-Puffin-13B-instruct-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T13:30:13.279131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ea322875ead3cd216a8fcafd88fbf73b5e857a68
# Dataset Card for Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload](https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T04:12:47.025545](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload/blob/main/results_2023-09-23T04-12-47.025545.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002307046979865772,
        "em_stderr": 0.0004913221265094532,
        "f1": 0.05567638422818793,
        "f1_stderr": 0.001338509283292818,
        "acc": 0.38151825095307307,
        "acc_stderr": 0.009759837355311614
    },
    "harness|drop|3": {
        "em": 0.002307046979865772,
        "em_stderr": 0.0004913221265094532,
        "f1": 0.05567638422818793,
        "f1_stderr": 0.001338509283292818
    },
    "harness|gsm8k|5": {
        "acc": 0.0621683093252464,
        "acc_stderr": 0.00665103564453169
    },
    "harness|winogrande|5": {
        "acc": 0.7008681925808997,
        "acc_stderr": 0.012868639066091536
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
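
As a usage note for the card above: each of the 64 configurations exposes one split per evaluation run (named by its timestamp) plus a "latest" alias. The sketch below is a minimal, hedged example of enumerating those configurations and loading a specific timestamped run, using the `get_dataset_config_names` and `get_dataset_split_names` helpers from the `datasets` library; the per-sample column layout is not documented in this card, so the snippet only prints it.

```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each configuration has one split per run (timestamp-named) and a "latest" alias.
splits = get_dataset_split_names(repo, "harness_gsm8k_5")
print(splits)

# Load a specific timestamped run instead of the "latest" alias.
run_split = next(s for s in splits if s != "latest")
details = load_dataset(repo, "harness_gsm8k_5", split=run_split)
print(details.column_names)
```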
open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload
[ "region:us" ]
2023-08-18T10:41:45+00:00
{"pretty_name": "Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload](https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T04:12:47.025545](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload/blob/main/results_2023-09-23T04-12-47.025545.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094532,\n \"f1\": 0.05567638422818793,\n \"f1_stderr\": 0.001338509283292818,\n \"acc\": 0.38151825095307307,\n \"acc_stderr\": 0.009759837355311614\n },\n \"harness|drop|3\": {\n \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094532,\n \"f1\": 0.05567638422818793,\n \"f1_stderr\": 0.001338509283292818\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0621683093252464,\n \"acc_stderr\": 0.00665103564453169\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7008681925808997,\n \"acc_stderr\": 0.012868639066091536\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|arc:challenge|25_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T04_12_47.025545", "path": ["**/details_harness|drop|3_2023-09-23T04-12-47.025545.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T04-12-47.025545.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T04_12_47.025545", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-12-47.025545.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-12-47.025545.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hellaswag|10_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T09:51:14.882748.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T09:51:14.882748.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T04_12_47.025545", "path": ["**/details_harness|winogrande|5_2023-09-23T04-12-47.025545.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T04-12-47.025545.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T09_51_14.882748", "path": ["results_2023-07-25T09:51:14.882748.parquet"]}, {"split": "2023_09_23T04_12_47.025545", "path": ["results_2023-09-23T04-12-47.025545.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T04-12-47.025545.parquet"]}]}]}
2023-09-23T03:12:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch below): ## Latest results These are the latest results from run 2023-09-23T04:12:47.025545 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
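The load snippet referenced above was stripped from this processed copy of the card; a minimal reconstruction, assuming the `details_<org>__<model>` repository naming used by the other cards in this dump:

```python
from datasets import load_dataset

# Sketch only: the original snippet was removed from this processed text.
# The repository id below is an assumption based on the
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_Aspik101__vicuna-7b-v1.3-instruct-pl-lora_unload",
    "harness_winogrande_5",
    split="train",
)
```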
[ "# Dataset Card for Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T04:12:47.025545(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T04:12:47.025545(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 31, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/vicuna-7b-v1.3-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T04:12:47.025545(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f1c26a6b48d3ad118c5e4c7ed33f6a4c8130a45d
# Dataset Card for Evaluation run of Aspik101/Nous-Hermes-13b-pl-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aspik101/Nous-Hermes-13b-pl-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aspik101/Nous-Hermes-13b-pl-lora_unload](https://huggingface.co/Aspik101/Nous-Hermes-13b-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__Nous-Hermes-13b-pl-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T05:49:33.504219](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Nous-Hermes-13b-pl-lora_unload/blob/main/results_2023-09-23T05-49-33.504219.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0350251677852349,
        "em_stderr": 0.0018827287598880225,
        "f1": 0.09843120805369145,
        "f1_stderr": 0.00232552209600243,
        "acc": 0.42825189253296936,
        "acc_stderr": 0.009957112862417898
    },
    "harness|drop|3": {
        "em": 0.0350251677852349,
        "em_stderr": 0.0018827287598880225,
        "f1": 0.09843120805369145,
        "f1_stderr": 0.00232552209600243
    },
    "harness|gsm8k|5": {
        "acc": 0.09249431387414708,
        "acc_stderr": 0.007980396874560173
    },
    "harness|winogrande|5": {
        "acc": 0.7640094711917916,
        "acc_stderr": 0.011933828850275625
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
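The snippet in the card above loads a single harness configuration. As a supplementary sketch (not part of the original card), the aggregated metrics described under "Latest results" could presumably be fetched from the "results" configuration; the "latest" split name is assumed here to load like any other split declared in this record's metadata:

```python
from datasets import load_dataset

# Sketch, not from the original card: load the aggregated metrics of the
# latest run via the "results" configuration declared in the metadata.
# Assumes the "latest" split loads as declared; the card's own example
# uses split="train" for the harness configurations instead.
results = load_dataset(
    "open-llm-leaderboard/details_Aspik101__Nous-Hermes-13b-pl-lora_unload",
    "results",
    split="latest",
)
print(results[0])
```

Each row of that split should correspond to one `results_*.parquet` file, so the latest split would contain the figures quoted in the card's "Latest results" block.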
open-llm-leaderboard/details_Aspik101__Nous-Hermes-13b-pl-lora_unload
[ "region:us" ]
2023-08-18T10:41:54+00:00
{"pretty_name": "Evaluation run of Aspik101/Nous-Hermes-13b-pl-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aspik101/Nous-Hermes-13b-pl-lora_unload](https://huggingface.co/Aspik101/Nous-Hermes-13b-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__Nous-Hermes-13b-pl-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T05:49:33.504219](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Nous-Hermes-13b-pl-lora_unload/blob/main/results_2023-09-23T05-49-33.504219.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0350251677852349,\n \"em_stderr\": 0.0018827287598880225,\n \"f1\": 0.09843120805369145,\n \"f1_stderr\": 0.00232552209600243,\n \"acc\": 0.42825189253296936,\n \"acc_stderr\": 0.009957112862417898\n },\n \"harness|drop|3\": {\n \"em\": 0.0350251677852349,\n \"em_stderr\": 0.0018827287598880225,\n \"f1\": 0.09843120805369145,\n \"f1_stderr\": 0.00232552209600243\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09249431387414708,\n \"acc_stderr\": 0.007980396874560173\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n }\n}\n```", "repo_url": "https://huggingface.co/Aspik101/Nous-Hermes-13b-pl-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T05_49_33.504219", "path": ["**/details_harness|drop|3_2023-09-23T05-49-33.504219.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T05-49-33.504219.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T05_49_33.504219", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-49-33.504219.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-49-33.504219.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:05:32.801971.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:05:32.801971.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:05:32.801971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:05:32.801971.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:05:32.801971.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:05:32.801971.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T05_49_33.504219", "path": ["**/details_harness|winogrande|5_2023-09-23T05-49-33.504219.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T05-49-33.504219.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T13_05_32.801971", "path": ["results_2023-07-24T13:05:32.801971.parquet"]}, {"split": "2023_09_23T05_49_33.504219", "path": ["results_2023-09-23T05-49-33.504219.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T05-49-33.504219.parquet"]}]}]}
2023-09-23T04:49:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aspik101/Nous-Hermes-13b-pl-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aspik101/Nous-Hermes-13b-pl-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T05:49:33.504219 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Aspik101/Nous-Hermes-13b-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Nous-Hermes-13b-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:49:33.504219(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aspik101/Nous-Hermes-13b-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Nous-Hermes-13b-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:49:33.504219(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aspik101/Nous-Hermes-13b-pl-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aspik101/Nous-Hermes-13b-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T05:49:33.504219(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ff550ef905197700fd06e5ebc89076c0f347ec3d
# Dataset Card for Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [wahaha1987/llama_13b_sharegpt94k_fastchat](https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T01:15:25.210552](https://huggingface.co/datasets/open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat/blob/main/results_2023-10-13T01-15-25.210552.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.07109899328859061,
        "em_stderr": 0.0026318194599633114,
        "f1": 0.13432151845637572,
        "f1_stderr": 0.0028813877533664808,
        "acc": 0.40513968332422795,
        "acc_stderr": 0.010090158389611751
    },
    "harness|drop|3": {
        "em": 0.07109899328859061,
        "em_stderr": 0.0026318194599633114,
        "f1": 0.13432151845637572,
        "f1_stderr": 0.0028813877533664808
    },
    "harness|gsm8k|5": {
        "acc": 0.0841546626231994,
        "acc_stderr": 0.007647024046603203
    },
    "harness|winogrande|5": {
        "acc": 0.7261247040252565,
        "acc_stderr": 0.012533292732620297
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
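Beyond the per-task details loaded above, the aggregated metrics quoted under "Latest results" can be pulled from the "results" configuration described in the Dataset Summary. A minimal sketch, assuming the `datasets` library is installed and using the configuration and split names listed for this repository:

```python
from datasets import load_dataset

# Aggregated metrics for wahaha1987/llama_13b_sharegpt94k_fastchat.
# The "latest" split always mirrors the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores
```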
open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat
[ "region:us" ]
2023-08-18T10:42:03+00:00
{"pretty_name": "Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat", "dataset_summary": "Dataset automatically created during the evaluation run of model [wahaha1987/llama_13b_sharegpt94k_fastchat](https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T01:15:25.210552](https://huggingface.co/datasets/open-llm-leaderboard/details_wahaha1987__llama_13b_sharegpt94k_fastchat/blob/main/results_2023-10-13T01-15-25.210552.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07109899328859061,\n \"em_stderr\": 0.0026318194599633114,\n \"f1\": 0.13432151845637572,\n \"f1_stderr\": 0.0028813877533664808,\n \"acc\": 0.40513968332422795,\n \"acc_stderr\": 0.010090158389611751\n },\n \"harness|drop|3\": {\n \"em\": 0.07109899328859061,\n \"em_stderr\": 0.0026318194599633114,\n \"f1\": 0.13432151845637572,\n \"f1_stderr\": 0.0028813877533664808\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0841546626231994,\n \"acc_stderr\": 0.007647024046603203\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620297\n }\n}\n```", "repo_url": "https://huggingface.co/wahaha1987/llama_13b_sharegpt94k_fastchat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T01_15_25.210552", "path": ["**/details_harness|drop|3_2023-10-13T01-15-25.210552.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T01-15-25.210552.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T01_15_25.210552", "path": ["**/details_harness|gsm8k|5_2023-10-13T01-15-25.210552.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T01-15-25.210552.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:35:52.707765.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:35:52.707765.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T01_15_25.210552", "path": ["**/details_harness|winogrande|5_2023-10-13T01-15-25.210552.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T01-15-25.210552.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_35_52.707765", "path": ["results_2023-07-19T18:35:52.707765.parquet"]}, {"split": "2023_10_13T01_15_25.210552", "path": ["results_2023-10-13T01-15-25.210552.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T01-15-25.210552.parquet"]}]}]}
2023-10-13T00:15:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model wahaha1987/llama_13b_sharegpt94k_fastchat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T01:15:25.210552 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model wahaha1987/llama_13b_sharegpt94k_fastchat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T01:15:25.210552(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model wahaha1987/llama_13b_sharegpt94k_fastchat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T01:15:25.210552(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of wahaha1987/llama_13b_sharegpt94k_fastchat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model wahaha1987/llama_13b_sharegpt94k_fastchat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T01:15:25.210552(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1029db3968de33cd27b4e84362a21290715c21c4
# Dataset Card for Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [ogimgio/gpt-neo-125m-neurallinguisticpioneers](https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-25T03:45:38.309218](https://huggingface.co/datasets/open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers/blob/main/results_2023-10-25T03-45-38.309218.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0024119127516778523,
        "em_stderr": 0.0005023380498893281,
        "f1": 0.041171350671141034,
        "f1_stderr": 0.0012117312202057759,
        "acc": 0.2564958864222914,
        "acc_stderr": 0.007403214467064075
    },
    "harness|drop|3": {
        "em": 0.0024119127516778523,
        "em_stderr": 0.0005023380498893281,
        "f1": 0.041171350671141034,
        "f1_stderr": 0.0012117312202057759
    },
    "harness|gsm8k|5": {
        "acc": 0.000758150113722517,
        "acc_stderr": 0.0007581501137225334
    },
    "harness|winogrande|5": {
        "acc": 0.5122336227308603,
        "acc_stderr": 0.014048278820405616
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
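Beyond the single `load_dataset` call shown in the card, the summary also mentions the aggregated "results" configuration and the "latest" split. The following is a minimal sketch of how one might explore those, assuming only the standard `datasets` API; the exact columns stored in the "results" configuration are not documented here, so the snippet simply inspects whatever is present.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.", configs[:5])

# "latest" always points at the most recent run (2023-10-25 for this model).
results = load_dataset(repo, "results", split="latest")
print(results.column_names)  # inspect which aggregated fields were stored
print(results[0])            # first row of the latest aggregated results
```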
open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers
[ "region:us" ]
2023-08-18T10:42:14+00:00
{"pretty_name": "Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers", "dataset_summary": "Dataset automatically created during the evaluation run of model [ogimgio/gpt-neo-125m-neurallinguisticpioneers](https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T03:45:38.309218](https://huggingface.co/datasets/open-llm-leaderboard/details_ogimgio__gpt-neo-125m-neurallinguisticpioneers/blob/main/results_2023-10-25T03-45-38.309218.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893281,\n \"f1\": 0.041171350671141034,\n \"f1_stderr\": 0.0012117312202057759,\n \"acc\": 0.2564958864222914,\n \"acc_stderr\": 0.007403214467064075\n },\n \"harness|drop|3\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893281,\n \"f1\": 0.041171350671141034,\n \"f1_stderr\": 0.0012117312202057759\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225334\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5122336227308603,\n \"acc_stderr\": 0.014048278820405616\n }\n}\n```", "repo_url": "https://huggingface.co/ogimgio/gpt-neo-125m-neurallinguisticpioneers", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T03_45_38.309218", "path": ["**/details_harness|drop|3_2023-10-25T03-45-38.309218.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T03-45-38.309218.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T03_45_38.309218", "path": ["**/details_harness|gsm8k|5_2023-10-25T03-45-38.309218.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T03-45-38.309218.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:56:53.861726.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:56:53.861726.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T03_45_38.309218", "path": ["**/details_harness|winogrande|5_2023-10-25T03-45-38.309218.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T03-45-38.309218.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_56_53.861726", "path": ["results_2023-07-19T13:56:53.861726.parquet"]}, {"split": "2023_10_25T03_45_38.309218", "path": ["results_2023-10-25T03-45-38.309218.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T03-45-38.309218.parquet"]}]}]}
2023-10-25T02:45:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers
## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model ogimgio/gpt-neo-125m-neurallinguisticpioneers on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-25T03:45:38.309218 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ogimgio/gpt-neo-125m-neurallinguisticpioneers on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T03:45:38.309218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ogimgio/gpt-neo-125m-neurallinguisticpioneers on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T03:45:38.309218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ogimgio/gpt-neo-125m-neurallinguisticpioneers## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ogimgio/gpt-neo-125m-neurallinguisticpioneers on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T03:45:38.309218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4c96d638cd151bedd085de886b8b5c928342bcf0
# Dataset Card for Evaluation run of vonjack/Qwen-LLaMAfied-HFTok-7B-Chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/vonjack/Qwen-LLaMAfied-HFTok-7B-Chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [vonjack/Qwen-LLaMAfied-HFTok-7B-Chat](https://huggingface.co/vonjack/Qwen-LLaMAfied-HFTok-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vonjack__Qwen-LLaMAfied-HFTok-7B-Chat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T03:24:19.085835](https://huggingface.co/datasets/open-llm-leaderboard/details_vonjack__Qwen-LLaMAfied-HFTok-7B-Chat/blob/main/results_2023-09-17T03-24-19.085835.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2769505033557047,
        "em_stderr": 0.004582735002079869,
        "f1": 0.35700713087248354,
        "f1_stderr": 0.004479366355977497,
        "acc": 0.36965233401927866,
        "acc_stderr": 0.008499288458301458
    },
    "harness|drop|3": {
        "em": 0.2769505033557047,
        "em_stderr": 0.004582735002079869,
        "f1": 0.35700713087248354,
        "f1_stderr": 0.004479366355977497
    },
    "harness|gsm8k|5": {
        "acc": 0.025018953752843062,
        "acc_stderr": 0.004302045046564296
    },
    "harness|winogrande|5": {
        "acc": 0.7142857142857143,
        "acc_stderr": 0.012696531870038621
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
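Since each run is stored as a timestamp-named split inside every configuration (as the summary above describes), one can also enumerate the available runs before loading a particular one. The sketch below makes the same assumptions as the earlier example: it only uses the "harness_winogrande_5" configuration named in the card and does not assume any particular per-example column names.

```python
from datasets import get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_vonjack__Qwen-LLaMAfied-HFTok-7B-Chat"
config = "harness_winogrande_5"

# Each evaluation run appears as its own timestamp-named split;
# "latest" points at the most recent run.
print(get_dataset_split_names(repo, config))

# Load the per-example details of the most recent winogrande run.
details = load_dataset(repo, config, split="latest")
print(details)           # row count and column names
print(details.features)  # schema of the per-example records
```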
open-llm-leaderboard/details_vonjack__Qwen-LLaMAfied-HFTok-7B-Chat
[ "region:us" ]
2023-08-18T10:42:23+00:00
{"pretty_name": "Evaluation run of vonjack/Qwen-LLaMAfied-HFTok-7B-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [vonjack/Qwen-LLaMAfied-HFTok-7B-Chat](https://huggingface.co/vonjack/Qwen-LLaMAfied-HFTok-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vonjack__Qwen-LLaMAfied-HFTok-7B-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T03:24:19.085835](https://huggingface.co/datasets/open-llm-leaderboard/details_vonjack__Qwen-LLaMAfied-HFTok-7B-Chat/blob/main/results_2023-09-17T03-24-19.085835.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2769505033557047,\n \"em_stderr\": 0.004582735002079869,\n \"f1\": 0.35700713087248354,\n \"f1_stderr\": 0.004479366355977497,\n \"acc\": 0.36965233401927866,\n \"acc_stderr\": 0.008499288458301458\n },\n \"harness|drop|3\": {\n \"em\": 0.2769505033557047,\n \"em_stderr\": 0.004582735002079869,\n \"f1\": 0.35700713087248354,\n \"f1_stderr\": 0.004479366355977497\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \"acc_stderr\": 0.004302045046564296\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038621\n }\n}\n```", "repo_url": "https://huggingface.co/vonjack/Qwen-LLaMAfied-HFTok-7B-Chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T03_24_19.085835", "path": ["**/details_harness|drop|3_2023-09-17T03-24-19.085835.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T03-24-19.085835.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T03_24_19.085835", "path": ["**/details_harness|gsm8k|5_2023-09-17T03-24-19.085835.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T03-24-19.085835.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:46:26.014889.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:46:26.014889.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:46:26.014889.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:46:26.014889.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:46:26.014889.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:46:26.014889.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T03_24_19.085835", "path": ["**/details_harness|winogrande|5_2023-09-17T03-24-19.085835.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T03-24-19.085835.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T20_46_26.014889", "path": ["results_2023-08-09T20:46:26.014889.parquet"]}, {"split": "2023_09_17T03_24_19.085835", "path": ["results_2023-09-17T03-24-19.085835.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T03-24-19.085835.parquet"]}]}]}
2023-09-17T02:24:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vonjack/Qwen-LLaMAfied-HFTok-7B-Chat

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model vonjack/Qwen-LLaMAfied-HFTok-7B-Chat on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (a loading sketch is given after this card text):

## Latest results

These are the latest results from run 2023-09-17T03:24:19.085835 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
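The card text above refers to a loading example that was dropped when the text was flattened. A minimal sketch, using the repo id and the config/split names declared in the accompanying metadata (the `datasets` library is assumed to be installed):

```python
from datasets import load_dataset

# Per-run details for one task configuration; "train" always points to the latest run,
# and each configuration also declares an explicit "latest" split in the metadata.
data = load_dataset(
    "open-llm-leaderboard/details_vonjack__Qwen-LLaMAfied-HFTok-7B-Chat",
    "harness_winogrande_5",
    split="train",
)
```

The same pattern applies to any of the other task configurations listed in the metadata, e.g. `harness_gsm8k_5` or `harness_drop_3`.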
[ "# Dataset Card for Evaluation run of vonjack/Qwen-LLaMAfied-HFTok-7B-Chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model vonjack/Qwen-LLaMAfied-HFTok-7B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T03:24:19.085835(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vonjack/Qwen-LLaMAfied-HFTok-7B-Chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model vonjack/Qwen-LLaMAfied-HFTok-7B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T03:24:19.085835(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vonjack/Qwen-LLaMAfied-HFTok-7B-Chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vonjack/Qwen-LLaMAfied-HFTok-7B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T03:24:19.085835(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2b840d6f73e51b09dd8f14ca52e8d72d8d733bdc
# Dataset Card for Evaluation run of Austism/chronos-hermes-13b-v2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Austism/chronos-hermes-13b-v2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Austism/chronos-hermes-13b-v2](https://huggingface.co/Austism/chronos-hermes-13b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). A sketch of loading this aggregated configuration is given after this card.

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Austism__chronos-hermes-13b-v2",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T05:48:56.060288](https://huggingface.co/datasets/open-llm-leaderboard/details_Austism__chronos-hermes-13b-v2/blob/main/results_2023-09-23T05-48-56.060288.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.021707214765100673,
        "em_stderr": 0.001492368687400628,
        "f1": 0.08727139261744986,
        "f1_stderr": 0.002006428881176311,
        "acc": 0.4356311405222559,
        "acc_stderr": 0.010489348281963759
    },
    "harness|drop|3": {
        "em": 0.021707214765100673,
        "em_stderr": 0.001492368687400628,
        "f1": 0.08727139261744986,
        "f1_stderr": 0.002006428881176311
    },
    "harness|gsm8k|5": {
        "acc": 0.11751326762699014,
        "acc_stderr": 0.008870331256489995
    },
    "harness|winogrande|5": {
        "acc": 0.7537490134175217,
        "acc_stderr": 0.012108365307437523
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
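The aggregated metrics shown under "Latest results" are also exposed through the "results" configuration described in the card. A minimal sketch, assuming that configuration and its "latest" split are laid out as the card describes:

```python
from datasets import load_dataset

# Aggregated results for the latest run (the metrics shown under "Latest results" above).
results = load_dataset(
    "open-llm-leaderboard/details_Austism__chronos-hermes-13b-v2",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated results table
```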
open-llm-leaderboard/details_Austism__chronos-hermes-13b-v2
[ "region:us" ]
2023-08-18T10:42:33+00:00
{"pretty_name": "Evaluation run of Austism/chronos-hermes-13b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Austism/chronos-hermes-13b-v2](https://huggingface.co/Austism/chronos-hermes-13b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Austism__chronos-hermes-13b-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T05:48:56.060288](https://huggingface.co/datasets/open-llm-leaderboard/details_Austism__chronos-hermes-13b-v2/blob/main/results_2023-09-23T05-48-56.060288.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.021707214765100673,\n \"em_stderr\": 0.001492368687400628,\n \"f1\": 0.08727139261744986,\n \"f1_stderr\": 0.002006428881176311,\n \"acc\": 0.4356311405222559,\n \"acc_stderr\": 0.010489348281963759\n },\n \"harness|drop|3\": {\n \"em\": 0.021707214765100673,\n \"em_stderr\": 0.001492368687400628,\n \"f1\": 0.08727139261744986,\n \"f1_stderr\": 0.002006428881176311\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \"acc_stderr\": 0.008870331256489995\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437523\n }\n}\n```", "repo_url": "https://huggingface.co/Austism/chronos-hermes-13b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T05_48_56.060288", "path": ["**/details_harness|drop|3_2023-09-23T05-48-56.060288.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T05-48-56.060288.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T05_48_56.060288", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-48-56.060288.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-48-56.060288.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:43.363551.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:43.363551.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:43.363551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:24:43.363551.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:24:43.363551.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T05_48_56.060288", "path": ["**/details_harness|winogrande|5_2023-09-23T05-48-56.060288.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T05-48-56.060288.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T10_24_43.363551", "path": ["results_2023-08-09T10:24:43.363551.parquet"]}, {"split": "2023_09_23T05_48_56.060288", "path": ["results_2023-09-23T05-48-56.060288.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T05-48-56.060288.parquet"]}]}]}
2023-09-23T04:49:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Austism/chronos-hermes-13b-v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Austism/chronos-hermes-13b-v2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T05:48:56.060288 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
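The loading example referenced just above was stripped from this flattened text; a minimal sketch of the equivalent call is given below. The repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming, while `harness_winogrande_5` and the `latest` split are declared in this record's metadata.

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_Austism__chronos-hermes-13b-v2",
    "harness_winogrande_5",  # one of the 64 per-task configurations
    split="latest",          # "latest" points at the most recent run
)
```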
[ "# Dataset Card for Evaluation run of Austism/chronos-hermes-13b-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Austism/chronos-hermes-13b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:48:56.060288(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Austism/chronos-hermes-13b-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Austism/chronos-hermes-13b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:48:56.060288(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Austism/chronos-hermes-13b-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Austism/chronos-hermes-13b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T05:48:56.060288(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
85b502ed5db3de6e592f7067880a9e211c042dc6
# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [LoupGarou/WizardCoder-Guanaco-15B-V1.1](https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T15:52:30.106380](https://huggingface.co/datasets/open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.1/blob/main/results_2023-09-22T15-52-30.106380.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.10580956375838926,
        "em_stderr": 0.003150047651575815,
        "f1": 0.16983640939597303,
        "f1_stderr": 0.0033726302998826852,
        "acc": 0.2945942759965605,
        "acc_stderr": 0.009278567029891577
    },
    "harness|drop|3": {
        "em": 0.10580956375838926,
        "em_stderr": 0.003150047651575815,
        "f1": 0.16983640939597303,
        "f1_stderr": 0.0033726302998826852
    },
    "harness|gsm8k|5": {
        "acc": 0.02880970432145565,
        "acc_stderr": 0.004607484283767452
    },
    "harness|winogrande|5": {
        "acc": 0.5603788476716653,
        "acc_stderr": 0.013949649776015703
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
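As a usage note beyond the per-task snippet in the card, the aggregated numbers shown under "Latest results" live in the separate "results" configuration; a minimal sketch of loading them, using the configuration and split names declared in this record's metadata:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for each run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.1",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics row for the latest run
```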
open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.1
[ "region:us" ]
2023-08-18T10:42:43+00:00
{"pretty_name": "Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [LoupGarou/WizardCoder-Guanaco-15B-V1.1](https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T15:52:30.106380](https://huggingface.co/datasets/open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.1/blob/main/results_2023-09-22T15-52-30.106380.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10580956375838926,\n \"em_stderr\": 0.003150047651575815,\n \"f1\": 0.16983640939597303,\n \"f1_stderr\": 0.0033726302998826852,\n \"acc\": 0.2945942759965605,\n \"acc_stderr\": 0.009278567029891577\n },\n \"harness|drop|3\": {\n \"em\": 0.10580956375838926,\n \"em_stderr\": 0.003150047651575815,\n \"f1\": 0.16983640939597303,\n \"f1_stderr\": 0.0033726302998826852\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02880970432145565,\n \"acc_stderr\": 0.004607484283767452\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5603788476716653,\n \"acc_stderr\": 0.013949649776015703\n }\n}\n```", "repo_url": "https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T15_52_30.106380", "path": ["**/details_harness|drop|3_2023-09-22T15-52-30.106380.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T15-52-30.106380.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T15_52_30.106380", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-52-30.106380.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-52-30.106380.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hellaswag|10_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:04:47.997241.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:04:47.997241.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:04:47.997241.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:04:47.997241.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:04:47.997241.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:04:47.997241.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T15_52_30.106380", "path": ["**/details_harness|winogrande|5_2023-09-22T15-52-30.106380.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T15-52-30.106380.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T21_04_47.997241", "path": ["results_2023-07-19T21:04:47.997241.parquet"]}, {"split": "2023_09_22T15_52_30.106380", "path": ["results_2023-09-22T15-52-30.106380.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T15-52-30.106380.parquet"]}]}]}
2023-09-22T14:52:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T15:52:30.106380 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:52:30.106380(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:52:30.106380(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T15:52:30.106380(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7add0bcbd4af73319195600cb94228f937f69e13
# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [LoupGarou/WizardCoder-Guanaco-15B-V1.0](https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-23T04:03:10.692358](https://huggingface.co/datasets/open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0/blob/main/results_2023-09-23T04-03-10.692358.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.04089765100671141, "em_stderr": 0.0020282491887764946, "f1": 0.08708682885906038, "f1_stderr": 0.002301893268858503, "acc": 0.27279042923742786, "acc_stderr": 0.008653599278888232 }, "harness|drop|3": { "em": 0.04089765100671141, "em_stderr": 0.0020282491887764946, "f1": 0.08708682885906038, "f1_stderr": 0.002301893268858503 }, "harness|gsm8k|5": { "acc": 0.014404852160727824, "acc_stderr": 0.0032820559171369513 }, "harness|winogrande|5": { "acc": 0.5311760063141279, "acc_stderr": 0.014025142640639511 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
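Beyond a single per-task configuration, the configs listed for this repository also include an aggregated "results" configuration with a "latest" split. A minimal sketch of loading it could look like the following (assuming only the repository id, config name, and split name shown in the configs; the column layout of the results files is not documented in the card):

```python
from datasets import load_dataset

# Sketch only: pull the aggregated "results" configuration for this evaluation
# run and print each record. The exact schema of the results files is not
# described in the card, so we simply inspect whatever fields are present.
results = load_dataset(
    "open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0",
    "results",
    split="latest",
)
for record in results:
    print(record)
```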
open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0
[ "region:us" ]
2023-08-18T10:42:53+00:00
{"pretty_name": "Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [LoupGarou/WizardCoder-Guanaco-15B-V1.0](https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T04:03:10.692358](https://huggingface.co/datasets/open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0/blob/main/results_2023-09-23T04-03-10.692358.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04089765100671141,\n \"em_stderr\": 0.0020282491887764946,\n \"f1\": 0.08708682885906038,\n \"f1_stderr\": 0.002301893268858503,\n \"acc\": 0.27279042923742786,\n \"acc_stderr\": 0.008653599278888232\n },\n \"harness|drop|3\": {\n \"em\": 0.04089765100671141,\n \"em_stderr\": 0.0020282491887764946,\n \"f1\": 0.08708682885906038,\n \"f1_stderr\": 0.002301893268858503\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.0032820559171369513\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5311760063141279,\n \"acc_stderr\": 0.014025142640639511\n }\n}\n```", "repo_url": "https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|arc:challenge|25_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T04_03_10.692358", "path": ["**/details_harness|drop|3_2023-09-23T04-03-10.692358.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T04-03-10.692358.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T04_03_10.692358", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-03-10.692358.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-03-10.692358.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hellaswag|10_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T16:55:06.473074.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T16:55:06.473074.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T04_03_10.692358", "path": ["**/details_harness|winogrande|5_2023-09-23T04-03-10.692358.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T04-03-10.692358.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T16_55_06.473074", "path": ["results_2023-07-24T16:55:06.473074.parquet"]}, {"split": "2023_09_23T04_03_10.692358", "path": ["results_2023-09-23T04-03-10.692358.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T04-03-10.692358.parquet"]}]}]}
2023-09-23T03:03:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.0 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T04:03:10.692358 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T04:03:10.692358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T04:03:10.692358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model LoupGarou/WizardCoder-Guanaco-15B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T04:03:10.692358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4a8c4b6bfb3230905d7ef70036c4e0b36914cd2e
# Dataset Card for Evaluation run of ewof/koishi-instruct-3b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/ewof/koishi-instruct-3b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [ewof/koishi-instruct-3b](https://huggingface.co/ewof/koishi-instruct-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ewof__koishi-instruct-3b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-17T08:44:21.498764](https://huggingface.co/datasets/open-llm-leaderboard/details_ewof__koishi-instruct-3b/blob/main/results_2023-09-17T08-44-21.498764.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.001153523489932886, "em_stderr": 0.0003476179896857095, "f1": 0.05410444630872499, "f1_stderr": 0.0012841997819823922, "acc": 0.32612811480319515, "acc_stderr": 0.008201890700454486 }, "harness|drop|3": { "em": 0.001153523489932886, "em_stderr": 0.0003476179896857095, "f1": 0.05410444630872499, "f1_stderr": 0.0012841997819823922 }, "harness|gsm8k|5": { "acc": 0.011372251705837756, "acc_stderr": 0.002920666198788737 }, "harness|winogrande|5": { "acc": 0.6408839779005525, "acc_stderr": 0.013483115202120236 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
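For completeness, a minimal sketch of loading one of the per-task detail configurations for this model could look like the following (assuming the "harness_gsm8k_5" configuration and "latest" split listed in the configs below; the per-example schema is not described in the card):

```python
from datasets import load_dataset

# Sketch only: load the GSM8K detail records for ewof/koishi-instruct-3b and
# inspect the first row. We only print the row as-is, since the per-example
# fields are not documented in the card.
details = load_dataset(
    "open-llm-leaderboard/details_ewof__koishi-instruct-3b",
    "harness_gsm8k_5",
    split="latest",
)
print(details[0])
```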
open-llm-leaderboard/details_ewof__koishi-instruct-3b
[ "region:us" ]
2023-08-18T10:43:02+00:00
{"pretty_name": "Evaluation run of ewof/koishi-instruct-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewof/koishi-instruct-3b](https://huggingface.co/ewof/koishi-instruct-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewof__koishi-instruct-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T08:44:21.498764](https://huggingface.co/datasets/open-llm-leaderboard/details_ewof__koishi-instruct-3b/blob/main/results_2023-09-17T08-44-21.498764.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05410444630872499,\n \"f1_stderr\": 0.0012841997819823922,\n \"acc\": 0.32612811480319515,\n \"acc_stderr\": 0.008201890700454486\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05410444630872499,\n \"f1_stderr\": 0.0012841997819823922\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.002920666198788737\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120236\n }\n}\n```", "repo_url": "https://huggingface.co/ewof/koishi-instruct-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T08_44_21.498764", "path": ["**/details_harness|drop|3_2023-09-17T08-44-21.498764.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T08-44-21.498764.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T08_44_21.498764", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-44-21.498764.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-44-21.498764.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:49:25.234956.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:49:25.234956.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T08_44_21.498764", "path": ["**/details_harness|winogrande|5_2023-09-17T08-44-21.498764.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T08-44-21.498764.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_49_25.234956", "path": ["results_2023-07-19T14:49:25.234956.parquet"]}, {"split": "2023_09_17T08_44_21.498764", "path": ["results_2023-09-17T08-44-21.498764.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T08-44-21.498764.parquet"]}]}]}
2023-09-17T07:44:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ewof/koishi-instruct-3b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model ewof/koishi-instruct-3b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T08:44:21.498764 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of ewof/koishi-instruct-3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ewof/koishi-instruct-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T08:44:21.498764(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ewof/koishi-instruct-3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ewof/koishi-instruct-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T08:44:21.498764(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ewof/koishi-instruct-3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ewof/koishi-instruct-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T08:44:21.498764(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
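The ewof/koishi-instruct-3b record above lists its configurations and splits only inside the metadata JSON, so a short illustrative sketch may help. This example is not part of the original card: it assumes the `datasets` library is installed and reuses the repository id, the `harness_winogrande_5` config name, and the `latest` split exactly as they appear in the metadata for this record.

```python
from datasets import load_dataset

# Illustrative sketch (not from the original card): load the most recent
# WinoGrande per-example details for ewof/koishi-instruct-3b.
# Repository id, config name, and split name are copied verbatim from the
# configs listed in this record's metadata.
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_ewof__koishi-instruct-3b",
    "harness_winogrande_5",
    split="latest",
)
print(winogrande_details)  # one row per evaluated WinoGrande example
```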
29355d5172b37c1f911253d71091450fed3844cb
# Dataset Card for Evaluation run of euclaise/gpt-neox-122m-minipile-digits

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [euclaise/gpt-neox-122m-minipile-digits](https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T13:51:30.117179](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits/blob/main/results_2023-09-22T13-51-30.117179.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.00033145814652193507,
        "f1": 0.010940645973154372,
        "f1_stderr": 0.0006046286914074817,
        "acc": 0.2616416732438832,
        "acc_stderr": 0.007018620654786819
    },
    "harness|drop|3": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.00033145814652193507,
        "f1": 0.010940645973154372,
        "f1_stderr": 0.0006046286914074817
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5232833464877664,
        "acc_stderr": 0.014037241309573638
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
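The card above only shows how to load a single task's details, while the aggregated "results" configuration it mentions is where the summary metrics live. As a hedged sketch (the repository id comes from the card; the "results" config and "latest" split names are assumed to follow the same pattern as the other evaluation records in this dump), the latest aggregated metrics could be loaded like this:

```python
from datasets import load_dataset

# Sketch only: load the aggregated "results" configuration for the latest
# run of euclaise/gpt-neox-122m-minipile-digits.
# The "results" config and "latest" split are assumed here, mirroring the
# config lists shown for the other evaluation records.
results = load_dataset(
    "open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics (em, f1, acc, ...) for the latest run
```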
open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits
[ "region:us" ]
2023-08-18T10:43:11+00:00
{"pretty_name": "Evaluation run of euclaise/gpt-neox-122m-minipile-digits", "dataset_summary": "Dataset automatically created during the evaluation run of model [euclaise/gpt-neox-122m-minipile-digits](https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T13:51:30.117179](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits/blob/main/results_2023-09-22T13-51-30.117179.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652193507,\n \"f1\": 0.010940645973154372,\n \"f1_stderr\": 0.0006046286914074817,\n \"acc\": 0.2616416732438832,\n \"acc_stderr\": 0.007018620654786819\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652193507,\n \"f1\": 0.010940645973154372,\n \"f1_stderr\": 0.0006046286914074817\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5232833464877664,\n \"acc_stderr\": 0.014037241309573638\n }\n}\n```", "repo_url": "https://huggingface.co/euclaise/gpt-neox-122m-minipile-digits", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T13_51_30.117179", "path": ["**/details_harness|drop|3_2023-09-22T13-51-30.117179.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T13-51-30.117179.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T13_51_30.117179", "path": ["**/details_harness|gsm8k|5_2023-09-22T13-51-30.117179.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T13-51-30.117179.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:54:44.863431.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:54:44.863431.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:54:44.863431.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T13_51_30.117179", "path": ["**/details_harness|winogrande|5_2023-09-22T13-51-30.117179.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T13-51-30.117179.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_54_44.863431", "path": ["results_2023-07-19T13:54:44.863431.parquet"]}, {"split": "2023_09_22T13_51_30.117179", "path": ["results_2023-09-22T13-51-30.117179.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T13-51-30.117179.parquet"]}]}]}
2023-09-22T12:51:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of euclaise/gpt-neox-122m-minipile-digits ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model euclaise/gpt-neox-122m-minipile-digits on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T13:51:30.117179 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
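The load snippet referenced above is stripped from this flattened text field; a minimal sketch of the equivalent call, assuming the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` repository naming for this model, would be:

```python
from datasets import load_dataset

# Repository name is an assumption, following the details_<org>__<model>
# convention used by the other evaluation datasets in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_euclaise__gpt-neox-122m-minipile-digits",
    "harness_winogrande_5",  # one of the 64 per-task configurations
    split="train",           # "train" always points to the latest results
)
```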
[ "# Dataset Card for Evaluation run of euclaise/gpt-neox-122m-minipile-digits", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/gpt-neox-122m-minipile-digits on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T13:51:30.117179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of euclaise/gpt-neox-122m-minipile-digits", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/gpt-neox-122m-minipile-digits on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T13:51:30.117179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of euclaise/gpt-neox-122m-minipile-digits## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/gpt-neox-122m-minipile-digits on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T13:51:30.117179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1753f8f0a8912268845c9cfc5ed6305aebfd7d57
# Dataset Card for Evaluation run of dsvv-cair/alpaca-cleaned-llama-30b-bf16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/dsvv-cair/alpaca-cleaned-llama-30b-bf16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [dsvv-cair/alpaca-cleaned-llama-30b-bf16](https://huggingface.co/dsvv-cair/alpaca-cleaned-llama-30b-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dsvv-cair__alpaca-cleaned-llama-30b-bf16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T20:32:42.598667](https://huggingface.co/datasets/open-llm-leaderboard/details_dsvv-cair__alpaca-cleaned-llama-30b-bf16/blob/main/results_2023-09-22T20-32-42.598667.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.032508389261744965,
        "em_stderr": 0.0018161887490111502,
        "f1": 0.09911283557047008,
        "f1_stderr": 0.0022624364500590114,
        "acc": 0.4254059872915611,
        "acc_stderr": 0.009560931288960338
    },
    "harness|drop|3": {
        "em": 0.032508389261744965,
        "em_stderr": 0.0018161887490111502,
        "f1": 0.09911283557047008,
        "f1_stderr": 0.0022624364500590114
    },
    "harness|gsm8k|5": {
        "acc": 0.07733131159969674,
        "acc_stderr": 0.007357713523222344
    },
    "harness|winogrande|5": {
        "acc": 0.7734806629834254,
        "acc_stderr": 0.011764149054698332
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_dsvv-cair__alpaca-cleaned-llama-30b-bf16
[ "region:us" ]
2023-08-18T10:43:19+00:00
{"pretty_name": "Evaluation run of dsvv-cair/alpaca-cleaned-llama-30b-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [dsvv-cair/alpaca-cleaned-llama-30b-bf16](https://huggingface.co/dsvv-cair/alpaca-cleaned-llama-30b-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dsvv-cair__alpaca-cleaned-llama-30b-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T20:32:42.598667](https://huggingface.co/datasets/open-llm-leaderboard/details_dsvv-cair__alpaca-cleaned-llama-30b-bf16/blob/main/results_2023-09-22T20-32-42.598667.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.032508389261744965,\n \"em_stderr\": 0.0018161887490111502,\n \"f1\": 0.09911283557047008,\n \"f1_stderr\": 0.0022624364500590114,\n \"acc\": 0.4254059872915611,\n \"acc_stderr\": 0.009560931288960338\n },\n \"harness|drop|3\": {\n \"em\": 0.032508389261744965,\n \"em_stderr\": 0.0018161887490111502,\n \"f1\": 0.09911283557047008,\n \"f1_stderr\": 0.0022624364500590114\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \"acc_stderr\": 0.007357713523222344\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698332\n }\n}\n```", "repo_url": "https://huggingface.co/dsvv-cair/alpaca-cleaned-llama-30b-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T20_32_42.598667", "path": ["**/details_harness|drop|3_2023-09-22T20-32-42.598667.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T20-32-42.598667.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T20_32_42.598667", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-32-42.598667.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-32-42.598667.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:14:09.885019.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:14:09.885019.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:14:09.885019.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:14:09.885019.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:14:09.885019.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:14:09.885019.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T20_32_42.598667", "path": ["**/details_harness|winogrande|5_2023-09-22T20-32-42.598667.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T20-32-42.598667.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_14_09.885019", "path": ["results_2023-07-19T22:14:09.885019.parquet"]}, {"split": "2023_09_22T20_32_42.598667", "path": ["results_2023-09-22T20-32-42.598667.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T20-32-42.598667.parquet"]}]}]}
2023-09-22T19:32:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dsvv-cair/alpaca-cleaned-llama-30b-bf16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model dsvv-cair/alpaca-cleaned-llama-30b-bf16 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T20:32:42.598667 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
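The loading snippet referenced in the summary above was stripped from this processed text field; as a purely illustrative sketch, the code below lists the available configurations and pulls the latest aggregated results. The repository id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming pattern, not quoted from this record.

```python
from datasets import get_dataset_config_names, load_dataset

# Assumed repo id, inferred from the "details_<org>__<model>" naming pattern
# used by the Open LLM Leaderboard details datasets; not stated verbatim here.
repo_id = "open-llm-leaderboard/details_dsvv-cair__alpaca-cleaned-llama-30b-bf16"

# One configuration per evaluated task, plus the aggregated "results" config.
print(get_dataset_config_names(repo_id))

# The "latest" split always points to the most recent run (2023-09-22 here).
results = load_dataset(repo_id, "results", split="latest")
print(results[0])
```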
[ "# Dataset Card for Evaluation run of dsvv-cair/alpaca-cleaned-llama-30b-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dsvv-cair/alpaca-cleaned-llama-30b-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T20:32:42.598667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dsvv-cair/alpaca-cleaned-llama-30b-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model dsvv-cair/alpaca-cleaned-llama-30b-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T20:32:42.598667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dsvv-cair/alpaca-cleaned-llama-30b-bf16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dsvv-cair/alpaca-cleaned-llama-30b-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T20:32:42.598667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
277e68cf7db85ddaea566e0ef86a9fc7682932f8
# Dataset Card for Evaluation run of AlpinDale/pygmalion-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/AlpinDale/pygmalion-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [AlpinDale/pygmalion-instruct](https://huggingface.co/AlpinDale/pygmalion-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AlpinDale__pygmalion-instruct",
	"harness_gsm8k_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-03T16:32:30.526837](https://huggingface.co/datasets/open-llm-leaderboard/details_AlpinDale__pygmalion-instruct/blob/main/results_2023-12-03T16-32-30.526837.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.09855951478392722,
        "acc_stderr": 0.008210320350946347
    },
    "harness|gsm8k|5": {
        "acc": 0.09855951478392722,
        "acc_stderr": 0.008210320350946347
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
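Beyond the per-task snippet shown in the card above, the aggregated metrics of each run can also be loaded. The sketch below is illustrative only; it relies solely on the config name ("results") and split name ("latest") that this dataset's own metadata declares.

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of every run;
# the "latest" split points to the most recent one (2023-12-03 for this model).
results = load_dataset(
    "open-llm-leaderboard/details_AlpinDale__pygmalion-instruct",
    "results",
    split="latest",
)
print(results[0])  # should include the gsm8k accuracy quoted in "Latest results"
```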
open-llm-leaderboard/details_AlpinDale__pygmalion-instruct
[ "region:us" ]
2023-08-18T10:43:28+00:00
{"pretty_name": "Evaluation run of AlpinDale/pygmalion-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [AlpinDale/pygmalion-instruct](https://huggingface.co/AlpinDale/pygmalion-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AlpinDale__pygmalion-instruct\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T16:32:30.526837](https://huggingface.co/datasets/open-llm-leaderboard/details_AlpinDale__pygmalion-instruct/blob/main/results_2023-12-03T16-32-30.526837.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.09855951478392722,\n \"acc_stderr\": 0.008210320350946347\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09855951478392722,\n \"acc_stderr\": 0.008210320350946347\n }\n}\n```", "repo_url": "https://huggingface.co/AlpinDale/pygmalion-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|arc:challenge|25_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_09T13_40_12.695061", "path": ["**/details_harness|drop|3_2023-09-09T13-40-12.695061.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-09T13-40-12.695061.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_09T13_40_12.695061", "path": ["**/details_harness|gsm8k|5_2023-09-09T13-40-12.695061.parquet"]}, {"split": "2023_12_03T16_32_30.526837", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-32-30.526837.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-32-30.526837.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hellaswag|10_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T11:20:15.687659.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T11:20:15.687659.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T11:20:15.687659.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T11:20:15.687659.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T11:20:15.687659.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T11:20:15.687659.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T11:20:15.687659.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_09T13_40_12.695061", "path": ["**/details_harness|winogrande|5_2023-09-09T13-40-12.695061.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-09T13-40-12.695061.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T11_20_15.687659", "path": ["results_2023-08-17T11:20:15.687659.parquet"]}, {"split": "2023_09_09T13_40_12.695061", "path": ["results_2023-09-09T13-40-12.695061.parquet"]}, {"split": "2023_12_03T16_32_30.526837", "path": ["results_2023-12-03T16-32-30.526837.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T16-32-30.526837.parquet"]}]}]}
2023-12-03T16:32:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AlpinDale/pygmalion-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model AlpinDale/pygmalion-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-03T16:32:30.526837 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of AlpinDale/pygmalion-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlpinDale/pygmalion-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T16:32:30.526837(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AlpinDale/pygmalion-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlpinDale/pygmalion-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T16:32:30.526837(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AlpinDale/pygmalion-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlpinDale/pygmalion-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T16:32:30.526837(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
47fe81fda806624cf1b5143c79a0e8a55b152975
# Dataset Card for Evaluation run of project-baize/baize-v2-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/project-baize/baize-v2-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [project-baize/baize-v2-13b](https://huggingface.co/project-baize/baize-v2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_project-baize__baize-v2-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-12T16:13:42.802156](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-v2-13b/blob/main/results_2023-10-12T16-13-42.802156.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0018875838926174498,
        "em_stderr": 0.0004445109990559112,
        "f1": 0.06463716442953052,
        "f1_stderr": 0.0014224354445974106,
        "acc": 0.4192375654704809,
        "acc_stderr": 0.01002367963522793
    },
    "harness|drop|3": {
        "em": 0.0018875838926174498,
        "em_stderr": 0.0004445109990559112,
        "f1": 0.06463716442953052,
        "f1_stderr": 0.0014224354445974106
    },
    "harness|gsm8k|5": {
        "acc": 0.08946171341925702,
        "acc_stderr": 0.0078615830499397
    },
    "harness|winogrande|5": {
        "acc": 0.7490134175217048,
        "acc_stderr": 0.012185776220516161
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
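To complement the snippet in the card above, here is a hedged sketch of inspecting the per-example rows of one task configuration with pandas. The config ("harness_winogrande_5") and split ("latest") names come from this dataset's metadata; the exact column layout is not documented in the card and is left to inspection.

```python
from datasets import load_dataset

# Per-task detail configs hold one row per evaluated example.
details = load_dataset(
    "open-llm-leaderboard/details_project-baize__baize-v2-13b",
    "harness_winogrande_5",
    split="latest",  # or a timestamped split for a specific run
)

# The card does not document the column schema, so inspect it directly.
print(details.column_names)
print(details.to_pandas().head())
```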
open-llm-leaderboard/details_project-baize__baize-v2-13b
[ "region:us" ]
2023-08-18T10:43:39+00:00
{"pretty_name": "Evaluation run of project-baize/baize-v2-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [project-baize/baize-v2-13b](https://huggingface.co/project-baize/baize-v2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_project-baize__baize-v2-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T16:13:42.802156](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-v2-13b/blob/main/results_2023-10-12T16-13-42.802156.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.0004445109990559112,\n \"f1\": 0.06463716442953052,\n \"f1_stderr\": 0.0014224354445974106,\n \"acc\": 0.4192375654704809,\n \"acc_stderr\": 0.01002367963522793\n },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.0004445109990559112,\n \"f1\": 0.06463716442953052,\n \"f1_stderr\": 0.0014224354445974106\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08946171341925702,\n \"acc_stderr\": 0.0078615830499397\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516161\n }\n}\n```", "repo_url": "https://huggingface.co/project-baize/baize-v2-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T16_13_42.802156", "path": ["**/details_harness|drop|3_2023-10-12T16-13-42.802156.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T16-13-42.802156.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T16_13_42.802156", "path": ["**/details_harness|gsm8k|5_2023-10-12T16-13-42.802156.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T16-13-42.802156.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:42:29.519016.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:42:29.519016.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:42:29.519016.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:42:29.519016.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:42:29.519016.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T16_13_42.802156", "path": ["**/details_harness|winogrande|5_2023-10-12T16-13-42.802156.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T16-13-42.802156.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T16_42_29.519016", "path": ["results_2023-07-18T16:42:29.519016.parquet"]}, {"split": "2023_10_12T16_13_42.802156", "path": ["results_2023-10-12T16-13-42.802156.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T16-13-42.802156.parquet"]}]}]}
2023-10-12T15:13:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of project-baize/baize-v2-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model project-baize/baize-v2-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-12T16:13:42.802156 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of project-baize/baize-v2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model project-baize/baize-v2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T16:13:42.802156(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of project-baize/baize-v2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model project-baize/baize-v2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T16:13:42.802156(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of project-baize/baize-v2-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model project-baize/baize-v2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T16:13:42.802156(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
fed67825ab85ffbde7344cf5fe2774acaa1c6845
# Dataset Card for Evaluation run of project-baize/baize-v2-7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/project-baize/baize-v2-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [project-baize/baize-v2-7b](https://huggingface.co/project-baize/baize-v2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_project-baize__baize-v2-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-17T20:41:40.256040](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-v2-7b/blob/main/results_2023-10-17T20-41-40.256040.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.0012583892617449664, "em_stderr": 0.0003630560893118995, "f1": 0.05669882550335583, "f1_stderr": 0.0013268477906001916, "acc": 0.37641345330495407, "acc_stderr": 0.009122223164597095 }, "harness|drop|3": { "em": 0.0012583892617449664, "em_stderr": 0.0003630560893118995, "f1": 0.05669882550335583, "f1_stderr": 0.0013268477906001916 }, "harness|gsm8k|5": { "acc": 0.04169825625473844, "acc_stderr": 0.005506205058175759 }, "harness|winogrande|5": { "acc": 0.7111286503551697, "acc_stderr": 0.01273824127101843 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
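As an illustrative sketch, not part of the original card: the per-task configurations mentioned above can be discovered programmatically before loading a split. The repository id is the one used in the load example, and the "results"/"latest" names are assumed to follow the same convention as the 13b card's metadata.

```python
from datasets import get_dataset_config_names, load_dataset

# List the per-task configurations plus the "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_project-baize__baize-v2-7b")
print(len(configs), configs[:5])

# Load the aggregated results of the latest run ("results" config, "latest" split).
latest = load_dataset(
    "open-llm-leaderboard/details_project-baize__baize-v2-7b",
    "results",
    split="latest",
)
```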
open-llm-leaderboard/details_project-baize__baize-v2-7b
[ "region:us" ]
2023-08-18T10:43:47+00:00
{"pretty_name": "Evaluation run of project-baize/baize-v2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [project-baize/baize-v2-7b](https://huggingface.co/project-baize/baize-v2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_project-baize__baize-v2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T20:41:40.256040](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-v2-7b/blob/main/results_2023-10-17T20-41-40.256040.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118995,\n \"f1\": 0.05669882550335583,\n \"f1_stderr\": 0.0013268477906001916,\n \"acc\": 0.37641345330495407,\n \"acc_stderr\": 0.009122223164597095\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118995,\n \"f1\": 0.05669882550335583,\n \"f1_stderr\": 0.0013268477906001916\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04169825625473844,\n \"acc_stderr\": 0.005506205058175759\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7111286503551697,\n \"acc_stderr\": 0.01273824127101843\n }\n}\n```", "repo_url": "https://huggingface.co/project-baize/baize-v2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T20_41_40.256040", "path": ["**/details_harness|drop|3_2023-10-17T20-41-40.256040.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T20-41-40.256040.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T20_41_40.256040", "path": ["**/details_harness|gsm8k|5_2023-10-17T20-41-40.256040.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T20-41-40.256040.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:24:12.338026.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:24:12.338026.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:24:12.338026.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:24:12.338026.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:24:12.338026.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T20_41_40.256040", "path": ["**/details_harness|winogrande|5_2023-10-17T20-41-40.256040.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T20-41-40.256040.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_24_12.338026", "path": ["results_2023-07-19T16:24:12.338026.parquet"]}, {"split": "2023_10_17T20_41_40.256040", "path": ["results_2023-10-17T20-41-40.256040.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T20-41-40.256040.parquet"]}]}]}
2023-10-17T19:41:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of project-baize/baize-v2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model project-baize/baize-v2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T20:41:40.256040 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of project-baize/baize-v2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model project-baize/baize-v2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T20:41:40.256040(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of project-baize/baize-v2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model project-baize/baize-v2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T20:41:40.256040(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of project-baize/baize-v2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model project-baize/baize-v2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T20:41:40.256040(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
41e27783fe1d9401359270476a531cedc95c6a3e
# Dataset Card for Evaluation run of jerryjalapeno/nart-100k-7b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/jerryjalapeno/nart-100k-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [jerryjalapeno/nart-100k-7b](https://huggingface.co/jerryjalapeno/nart-100k-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T00:09:38.629020](https://huggingface.co/datasets/open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b/blob/main/results_2023-09-23T00-09-38.629020.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.09867869127516779,
        "em_stderr": 0.003054155613095924,
        "f1": 0.1502359479865761,
        "f1_stderr": 0.0031707924833711204,
        "acc": 0.3702237889195194,
        "acc_stderr": 0.008962759297749477
    },
    "harness|drop|3": {
        "em": 0.09867869127516779,
        "em_stderr": 0.003054155613095924,
        "f1": 0.1502359479865761,
        "f1_stderr": 0.0031707924833711204
    },
    "harness|gsm8k|5": {
        "acc": 0.0356330553449583,
        "acc_stderr": 0.00510610785374419
    },
    "harness|winogrande|5": {
        "acc": 0.7048145224940805,
        "acc_stderr": 0.012819410741754763
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
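The winogrande example above generalises to any configuration declared in this record's metadata. As a hedged sketch (it assumes Hub access and does not rely on any particular column layout of the underlying parquet files), the latest GSM8K details can be pulled the same way:

```python
from datasets import load_dataset

# "harness_gsm8k_5" and the "latest" split are both enumerated in this
# record's metadata, so only names already declared there are reused.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b",
    "harness_gsm8k_5",
    split="latest",
)
# Inspect the shape without assuming specific column names.
print(gsm8k_details.num_rows, gsm8k_details.column_names)
```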
open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b
[ "region:us" ]
2023-08-18T10:43:56+00:00
{"pretty_name": "Evaluation run of jerryjalapeno/nart-100k-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jerryjalapeno/nart-100k-7b](https://huggingface.co/jerryjalapeno/nart-100k-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T00:09:38.629020](https://huggingface.co/datasets/open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b/blob/main/results_2023-09-23T00-09-38.629020.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.09867869127516779,\n \"em_stderr\": 0.003054155613095924,\n \"f1\": 0.1502359479865761,\n \"f1_stderr\": 0.0031707924833711204,\n \"acc\": 0.3702237889195194,\n \"acc_stderr\": 0.008962759297749477\n },\n \"harness|drop|3\": {\n \"em\": 0.09867869127516779,\n \"em_stderr\": 0.003054155613095924,\n \"f1\": 0.1502359479865761,\n \"f1_stderr\": 0.0031707924833711204\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \"acc_stderr\": 0.00510610785374419\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754763\n }\n}\n```", "repo_url": "https://huggingface.co/jerryjalapeno/nart-100k-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T00_09_38.629020", "path": ["**/details_harness|drop|3_2023-09-23T00-09-38.629020.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T00-09-38.629020.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T00_09_38.629020", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-09-38.629020.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-09-38.629020.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:14:45.628566.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:14:45.628566.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T00_09_38.629020", "path": ["**/details_harness|winogrande|5_2023-09-23T00-09-38.629020.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T00-09-38.629020.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_14_45.628566", "path": ["results_2023-07-24T11:14:45.628566.parquet"]}, {"split": "2023_09_23T00_09_38.629020", "path": ["results_2023-09-23T00-09-38.629020.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T00-09-38.629020.parquet"]}]}]}
2023-09-22T23:09:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jerryjalapeno/nart-100k-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jerryjalapeno/nart-100k-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T00:09:38.629020 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
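The processed card text above ends the load instructions at the colon because the original code fence was stripped. A minimal sketch of what that load looks like, assuming the repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (the card only names the evaluated model jerryjalapeno/nart-100k-7b, so the exact repo id is an assumption):

```python
from datasets import load_dataset

# Assumed repository id, inferred from the leaderboard's naming pattern;
# the card itself only states the evaluated model name.
repo_id = "open-llm-leaderboard/details_jerryjalapeno__nart-100k-7b"

# "harness_winogrande_5" is one of the configurations declared in this row's metadata;
# the "latest" split always points to the most recent evaluation run.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data)
```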
[ "# Dataset Card for Evaluation run of jerryjalapeno/nart-100k-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jerryjalapeno/nart-100k-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:09:38.629020(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jerryjalapeno/nart-100k-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jerryjalapeno/nart-100k-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:09:38.629020(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jerryjalapeno/nart-100k-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jerryjalapeno/nart-100k-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T00:09:38.629020(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9dd69a6cd8bd14ec5a9ed0f7b3d4f262e376cdcd
# Dataset Card for Evaluation run of MrNJK/gpt2-xl-sft ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/MrNJK/gpt2-xl-sft - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [MrNJK/gpt2-xl-sft](https://huggingface.co/MrNJK/gpt2-xl-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_MrNJK__gpt2-xl-sft", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-17T20:10:52.677287](https://huggingface.co/datasets/open-llm-leaderboard/details_MrNJK__gpt2-xl-sft/blob/main/results_2023-09-17T20-10-52.677287.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.001572986577181208, "em_stderr": 0.000405845113241776, "f1": 0.053466862416107416, "f1_stderr": 0.0012595479932490756, "acc": 0.28161237645653686, "acc_stderr": 0.00817723914058038 }, "harness|drop|3": { "em": 0.001572986577181208, "em_stderr": 0.000405845113241776, "f1": 0.053466862416107416, "f1_stderr": 0.0012595479932490756 }, "harness|gsm8k|5": { "acc": 0.0075815011372251705, "acc_stderr": 0.0023892815120772075 }, "harness|winogrande|5": { "acc": 0.5556432517758485, "acc_stderr": 0.013965196769083553 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
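As a complement to the per-task snippet in the card above, a small sketch of reading the aggregated "results" configuration; the configuration and split names come from the metadata for this repository, while the exact column layout of the parquet is not assumed here:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores of each run;
# its "latest" split points to the most recent run (2023-09-17T20:10:52.677287 here).
results = load_dataset(
    "open-llm-leaderboard/details_MrNJK__gpt2-xl-sft",
    "results",
    split="latest",
)
print(results[0])  # first row: aggregated metrics of the latest run
```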
open-llm-leaderboard/details_MrNJK__gpt2-xl-sft
[ "region:us" ]
2023-08-18T10:44:05+00:00
{"pretty_name": "Evaluation run of MrNJK/gpt2-xl-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [MrNJK/gpt2-xl-sft](https://huggingface.co/MrNJK/gpt2-xl-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MrNJK__gpt2-xl-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T20:10:52.677287](https://huggingface.co/datasets/open-llm-leaderboard/details_MrNJK__gpt2-xl-sft/blob/main/results_2023-09-17T20-10-52.677287.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241776,\n \"f1\": 0.053466862416107416,\n \"f1_stderr\": 0.0012595479932490756,\n \"acc\": 0.28161237645653686,\n \"acc_stderr\": 0.00817723914058038\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241776,\n \"f1\": 0.053466862416107416,\n \"f1_stderr\": 0.0012595479932490756\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.0023892815120772075\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5556432517758485,\n \"acc_stderr\": 0.013965196769083553\n }\n}\n```", "repo_url": "https://huggingface.co/MrNJK/gpt2-xl-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T20_10_52.677287", "path": ["**/details_harness|drop|3_2023-09-17T20-10-52.677287.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T20-10-52.677287.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T20_10_52.677287", "path": ["**/details_harness|gsm8k|5_2023-09-17T20-10-52.677287.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T20-10-52.677287.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:21:02.216696.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:21:02.216696.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:21:02.216696.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T20_10_52.677287", "path": ["**/details_harness|winogrande|5_2023-09-17T20-10-52.677287.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T20-10-52.677287.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T09_21_02.216696", "path": ["results_2023-08-09T09:21:02.216696.parquet"]}, {"split": "2023_09_17T20_10_52.677287", "path": ["results_2023-09-17T20-10-52.677287.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T20-10-52.677287.parquet"]}]}]}
2023-09-17T19:11:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of MrNJK/gpt2-xl-sft ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model MrNJK/gpt2-xl-sft on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T20:10:52.677287 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
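Since each of these repositories exposes 64 configurations, it can help to enumerate them programmatically before picking one. A sketch using the standard `datasets` helpers; the exact list returned depends on the repository contents, and `harness_gsm8k_5` is taken from the metadata above:

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_MrNJK__gpt2-xl-sft"

# One configuration per evaluated task (plus "results"); print a few of them,
# then load the "latest" split of the GSM8K details as an example.
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

gsm8k = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(gsm8k)
```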
[ "# Dataset Card for Evaluation run of MrNJK/gpt2-xl-sft", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MrNJK/gpt2-xl-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T20:10:52.677287(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of MrNJK/gpt2-xl-sft", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MrNJK/gpt2-xl-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T20:10:52.677287(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MrNJK/gpt2-xl-sft## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MrNJK/gpt2-xl-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T20:10:52.677287(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
72b82e7bdebd58d8b14cc101516de745fe656131
# Dataset Card for Evaluation run of Vmware/open-llama-7b-v2-open-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Vmware/open-llama-7b-v2-open-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Vmware/open-llama-7b-v2-open-instruct](https://huggingface.co/Vmware/open-llama-7b-v2-open-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Vmware__open-llama-7b-v2-open-instruct",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T14:52:40.191492](https://huggingface.co/datasets/open-llm-leaderboard/details_Vmware__open-llama-7b-v2-open-instruct/blob/main/results_2023-09-22T14-52-40.191492.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.19966442953020133,
        "em_stderr": 0.0040937947224084096,
        "f1": 0.2588475251677862,
        "f1_stderr": 0.004161644087840314,
        "acc": 0.3587752434966338,
        "acc_stderr": 0.010343619065437147
    },
    "harness|drop|3": {
        "em": 0.19966442953020133,
        "em_stderr": 0.0040937947224084096,
        "f1": 0.2588475251677862,
        "f1_stderr": 0.004161644087840314
    },
    "harness|gsm8k|5": {
        "acc": 0.07429871114480667,
        "acc_stderr": 0.007223844172845568
    },
    "harness|winogrande|5": {
        "acc": 0.6432517758484609,
        "acc_stderr": 0.013463393958028725
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
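Beyond the per-task details, the aggregated scores quoted under "Latest results" can also be pulled with the same `datasets` API shown above: the card's metadata declares a "results" configuration whose "latest" split points at the most recent results parquet. The snippet below is a minimal sketch; the column layout of that parquet is not documented in this card, so it only loads the split and inspects its schema.

```python
from datasets import load_dataset

# Sketch: load the aggregated results of the latest run for this model.
# The config name "results" and split name "latest" come from this card's
# metadata; the parquet's column layout is not documented here, so we just
# print the schema and the first record.
results = load_dataset(
    "open-llm-leaderboard/details_Vmware__open-llama-7b-v2-open-instruct",
    "results",
    split="latest",
)
print(results.column_names)
print(results[0])
```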
open-llm-leaderboard/details_Vmware__open-llama-7b-v2-open-instruct
[ "region:us" ]
2023-08-18T10:44:13+00:00
{"pretty_name": "Evaluation run of Vmware/open-llama-7b-v2-open-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [Vmware/open-llama-7b-v2-open-instruct](https://huggingface.co/Vmware/open-llama-7b-v2-open-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Vmware__open-llama-7b-v2-open-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T14:52:40.191492](https://huggingface.co/datasets/open-llm-leaderboard/details_Vmware__open-llama-7b-v2-open-instruct/blob/main/results_2023-09-22T14-52-40.191492.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19966442953020133,\n \"em_stderr\": 0.0040937947224084096,\n \"f1\": 0.2588475251677862,\n \"f1_stderr\": 0.004161644087840314,\n \"acc\": 0.3587752434966338,\n \"acc_stderr\": 0.010343619065437147\n },\n \"harness|drop|3\": {\n \"em\": 0.19966442953020133,\n \"em_stderr\": 0.0040937947224084096,\n \"f1\": 0.2588475251677862,\n \"f1_stderr\": 0.004161644087840314\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07429871114480667,\n \"acc_stderr\": 0.007223844172845568\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6432517758484609,\n \"acc_stderr\": 0.013463393958028725\n }\n}\n```", "repo_url": "https://huggingface.co/Vmware/open-llama-7b-v2-open-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T12_17_47.553479", "path": ["**/details_harness|drop|3_2023-09-16T12-17-47.553479.parquet"]}, {"split": "2023_09_22T14_52_40.191492", "path": ["**/details_harness|drop|3_2023-09-22T14-52-40.191492.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T14-52-40.191492.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T12_17_47.553479", "path": ["**/details_harness|gsm8k|5_2023-09-16T12-17-47.553479.parquet"]}, {"split": "2023_09_22T14_52_40.191492", "path": ["**/details_harness|gsm8k|5_2023-09-22T14-52-40.191492.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T14-52-40.191492.parquet"]}]}, {"config_name": "harness_hellaswag_10", 
"data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:20.574844.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:20.574844.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:20.574844.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:20.574844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:27:20.574844.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:27:20.574844.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T12_17_47.553479", "path": ["**/details_harness|winogrande|5_2023-09-16T12-17-47.553479.parquet"]}, {"split": "2023_09_22T14_52_40.191492", "path": ["**/details_harness|winogrande|5_2023-09-22T14-52-40.191492.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T14-52-40.191492.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_27_20.574844", "path": ["results_2023-07-19T16:27:20.574844.parquet"]}, {"split": "2023_09_16T12_17_47.553479", "path": ["results_2023-09-16T12-17-47.553479.parquet"]}, {"split": "2023_09_22T14_52_40.191492", "path": ["results_2023-09-22T14-52-40.191492.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T14-52-40.191492.parquet"]}]}]}
2023-09-22T13:52:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Vmware/open-llama-7b-v2-open-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Vmware/open-llama-7b-v2-open-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T14:52:40.191492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Vmware/open-llama-7b-v2-open-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Vmware/open-llama-7b-v2-open-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T14:52:40.191492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Vmware/open-llama-7b-v2-open-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Vmware/open-llama-7b-v2-open-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T14:52:40.191492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Vmware/open-llama-7b-v2-open-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Vmware/open-llama-7b-v2-open-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T14:52:40.191492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
943e547705dc65997b074d0467599b02fa102629
# Dataset Card for Evaluation run of abhishek/llama2guanacotest

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/abhishek/llama2guanacotest
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [abhishek/llama2guanacotest](https://huggingface.co/abhishek/llama2guanacotest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__llama2guanacotest",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T17:34:42.809014](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__llama2guanacotest/blob/main/results_2023-09-22T17-34-42.809014.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.1018246644295302,
        "em_stderr": 0.0030970392367407284,
        "f1": 0.15182571308724796,
        "f1_stderr": 0.0032356577343186617,
        "acc": 0.42458141676534983,
        "acc_stderr": 0.010661835808025592
    },
    "harness|drop|3": {
        "em": 0.1018246644295302,
        "em_stderr": 0.0030970392367407284,
        "f1": 0.15182571308724796,
        "f1_stderr": 0.0032356577343186617
    },
    "harness|gsm8k|5": {
        "acc": 0.11751326762699014,
        "acc_stderr": 0.008870331256489988
    },
    "harness|winogrande|5": {
        "acc": 0.7316495659037096,
        "acc_stderr": 0.012453340359561195
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
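The per-sample records behind these aggregates live in the task configurations named in the metadata below (for example "harness_gsm8k_5" or "harness_drop_3"). As a minimal sketch, loading the "latest" split of the GSM8K configuration returns the individual model outputs that the accuracy above was computed from; the exact field names in the parquet are whatever the evaluation harness stored, so the snippet only inspects them.

```python
from datasets import load_dataset

# Sketch: inspect the per-sample GSM8K details behind the aggregated score.
# Config and split names come from this card's metadata; the record schema
# is not documented here, so we only print it.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_abhishek__llama2guanacotest",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details.column_names)
print(gsm8k_details[0])
```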
open-llm-leaderboard/details_abhishek__llama2guanacotest
[ "region:us" ]
2023-08-18T10:44:21+00:00
{"pretty_name": "Evaluation run of abhishek/llama2guanacotest", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhishek/llama2guanacotest](https://huggingface.co/abhishek/llama2guanacotest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishek__llama2guanacotest\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T17:34:42.809014](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__llama2guanacotest/blob/main/results_2023-09-22T17-34-42.809014.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1018246644295302,\n \"em_stderr\": 0.0030970392367407284,\n \"f1\": 0.15182571308724796,\n \"f1_stderr\": 0.0032356577343186617,\n \"acc\": 0.42458141676534983,\n \"acc_stderr\": 0.010661835808025592\n },\n \"harness|drop|3\": {\n \"em\": 0.1018246644295302,\n \"em_stderr\": 0.0030970392367407284,\n \"f1\": 0.15182571308724796,\n \"f1_stderr\": 0.0032356577343186617\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \"acc_stderr\": 0.008870331256489988\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n }\n}\n```", "repo_url": "https://huggingface.co/abhishek/llama2guanacotest", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|arc:challenge|25_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T17_34_42.809014", "path": ["**/details_harness|drop|3_2023-09-22T17-34-42.809014.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T17-34-42.809014.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T17_34_42.809014", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-34-42.809014.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-34-42.809014.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hellaswag|10_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:26:15.590917.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:26:15.590917.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T13:26:15.590917.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T13:26:15.590917.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T13:26:15.590917.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T17_34_42.809014", "path": ["**/details_harness|winogrande|5_2023-09-22T17-34-42.809014.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T17-34-42.809014.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T13_26_15.590917", "path": ["results_2023-08-17T13:26:15.590917.parquet"]}, {"split": "2023_09_22T17_34_42.809014", "path": ["results_2023-09-22T17-34-42.809014.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T17-34-42.809014.parquet"]}]}]}
2023-09-22T16:34:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of abhishek/llama2guanacotest ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model abhishek/llama2guanacotest on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch immediately after this entry): ## Latest results These are the latest results from run 2023-09-22T17:34:42.809014 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
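The summary above says "you can for instance do the following", but in this flattened copy the accompanying snippet was stripped out. The version below simply mirrors the snippet stored in this entry's metadata (same repository, configuration, and split names); nothing here is new.

```python
from datasets import load_dataset

# Per-example details for one task; the "train" split is an alias
# for the latest run, as described in the summary above.
data = load_dataset(
    "open-llm-leaderboard/details_abhishek__llama2guanacotest",
    "harness_winogrande_5",
    split="train",
)
```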
[ "# Dataset Card for Evaluation run of abhishek/llama2guanacotest", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model abhishek/llama2guanacotest on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:34:42.809014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of abhishek/llama2guanacotest", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model abhishek/llama2guanacotest on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T17:34:42.809014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abhishek/llama2guanacotest## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model abhishek/llama2guanacotest on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T17:34:42.809014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cb40ea1c375d2ce1cd89071ecb598cd737913deb
# Dataset Card for Evaluation run of openlm-research/open_llama_3b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openlm-research/open_llama_3b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openlm-research/open_llama_3b](https://huggingface.co/openlm-research/open_llama_3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); a sketch of reading it follows this card.

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_3b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T03:50:40.523576](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b/blob/main/results_2023-10-18T03-50-40.523576.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.00023443780464835776,
        "f1": 0.050632340604026965,
        "f1_stderr": 0.001271781500579302,
        "acc": 0.32587350322198844,
        "acc_stderr": 0.00764164157289629
    },
    "harness|drop|3": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.00023443780464835776,
        "f1": 0.050632340604026965,
        "f1_stderr": 0.001271781500579302
    },
    "harness|gsm8k|5": {
        "acc": 0.004548900682335102,
        "acc_stderr": 0.0018535550440036204
    },
    "harness|winogrande|5": {
        "acc": 0.6471981057616417,
        "acc_stderr": 0.01342972810178896
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
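The card above mentions the "results" configuration that stores the aggregated metrics shown under "Latest results". A minimal sketch of reading it, assuming only what the configuration list declares (a "results" configuration with a "latest" split); the column layout of the results parquet is not documented in the card, so the example inspects it rather than assuming field names.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; "latest" mirrors the newest
# timestamped split listed in the configuration metadata.
results = load_dataset(
    "open-llm-leaderboard/details_openlm-research__open_llama_3b",
    "results",
    split="latest",
)

# Inspect the schema instead of assuming specific column names.
print(results.column_names)
print(results[0])
```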
open-llm-leaderboard/details_openlm-research__open_llama_3b
[ "region:us" ]
2023-08-18T10:44:30+00:00
{"pretty_name": "Evaluation run of openlm-research/open_llama_3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [openlm-research/open_llama_3b](https://huggingface.co/openlm-research/open_llama_3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openlm-research__open_llama_3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T03:50:40.523576](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b/blob/main/results_2023-10-18T03-50-40.523576.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464835776,\n \"f1\": 0.050632340604026965,\n \"f1_stderr\": 0.001271781500579302,\n \"acc\": 0.32587350322198844,\n \"acc_stderr\": 0.00764164157289629\n },\n \"harness|drop|3\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464835776,\n \"f1\": 0.050632340604026965,\n \"f1_stderr\": 0.001271781500579302\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.01342972810178896\n }\n}\n```", "repo_url": "https://huggingface.co/openlm-research/open_llama_3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T03_50_40.523576", "path": ["**/details_harness|drop|3_2023-10-18T03-50-40.523576.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T03-50-40.523576.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T03_50_40.523576", "path": ["**/details_harness|gsm8k|5_2023-10-18T03-50-40.523576.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T03-50-40.523576.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:22:49.433588.parquet"]}, {"split": 
"2023_07_19T10_43_17.176281", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:49.433588.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:49.433588.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": 
"2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", 
"data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:43:17.176281.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:43:17.176281.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T03_50_40.523576", "path": ["**/details_harness|winogrande|5_2023-10-18T03-50-40.523576.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T03-50-40.523576.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T14:52:20.646698.parquet", 
"**/details_original|mmlu:formal_logic|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:management|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:virology|5_2023-08-28T14:52:20.646698.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": 
["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:31:38.653587.parquet", 
"**/details_original|mmlu:moral_disputes|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:31:38.653587.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet", 
"**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet", 
"**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet", 
"**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet", "**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", 
"path": ["**/details_original|mmlu:college_biology|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:computer_security|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:global_facts|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": 
["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": 
[{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", 
"data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:management|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:management|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": 
["**/details_original|mmlu:miscellaneous|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": 
["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": 
["**/details_original|mmlu:sociology|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:virology|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T14_52_20.646698", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-29T12:36:41.239310.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_22_49.433588", "path": ["results_2023-07-18T11:22:49.433588.parquet"]}, {"split": "2023_07_19T10_43_17.176281", "path": ["results_2023-07-19T10:43:17.176281.parquet"]}, {"split": "2023_08_28T14_52_20.646698", "path": ["results_2023-08-28T14:52:20.646698.parquet"]}, {"split": "2023_08_28T20_31_38.653587", "path": ["results_2023-08-28T20:31:38.653587.parquet"]}, {"split": "2023_08_29T12_36_41.239310", "path": ["results_2023-08-29T12:36:41.239310.parquet"]}, {"split": "2023_10_18T03_50_40.523576", "path": ["results_2023-10-18T03-50-40.523576.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T03-50-40.523576.parquet"]}]}]}
2023-10-18T02:50:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openlm-research/open_llama_3b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openlm-research/open_llama_3b on the Open LLM Leaderboard. The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T03:50:40.523576 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
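A minimal sketch of the loading call referenced in the summary above, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming; the "results" config and its "latest" split are taken from the configuration metadata earlier in this record.

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's "details_<org>__<model>" convention;
# the "results" config and its "latest" split are listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_openlm-research__open_llama_3b",
    "results",
    split="latest",
)
```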
[ "# Dataset Card for Evaluation run of openlm-research/open_llama_3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T03:50:40.523576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openlm-research/open_llama_3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T03:50:40.523576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openlm-research/open_llama_3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T03:50:40.523576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e8a70f11fc65a60367bb7d7491a83e3553ca0fa2
# Dataset Card for Evaluation run of openlm-research/open_llama_3b_v2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openlm-research/open_llama_3b_v2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openlm-research/open_llama_3b_v2](https://huggingface.co/openlm-research/open_llama_3b_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_3b_v2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T11:22:56.677003](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b_v2/blob/main/results_2023-10-15T11-22-56.677003.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857095,
        "f1": 0.05134962248322172,
        "f1_stderr": 0.0012730168443049574,
        "acc": 0.3395923103113801,
        "acc_stderr": 0.007914879526646601
    },
    "harness|drop|3": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857095,
        "f1": 0.05134962248322172,
        "f1_stderr": 0.0012730168443049574
    },
    "harness|gsm8k|5": {
        "acc": 0.009097801364670205,
        "acc_stderr": 0.002615326510775673
    },
    "harness|winogrande|5": {
        "acc": 0.67008681925809,
        "acc_stderr": 0.013214432542517527
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_openlm-research__open_llama_3b_v2
[ "region:us" ]
2023-08-18T10:44:40+00:00
{"pretty_name": "Evaluation run of openlm-research/open_llama_3b_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [openlm-research/open_llama_3b_v2](https://huggingface.co/openlm-research/open_llama_3b_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openlm-research__open_llama_3b_v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T11:22:56.677003](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b_v2/blob/main/results_2023-10-15T11-22-56.677003.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05134962248322172,\n \"f1_stderr\": 0.0012730168443049574,\n \"acc\": 0.3395923103113801,\n \"acc_stderr\": 0.007914879526646601\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05134962248322172,\n \"f1_stderr\": 0.0012730168443049574\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.002615326510775673\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.67008681925809,\n \"acc_stderr\": 0.013214432542517527\n }\n}\n```", "repo_url": "https://huggingface.co/openlm-research/open_llama_3b_v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T11_22_56.677003", "path": ["**/details_harness|drop|3_2023-10-15T11-22-56.677003.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T11-22-56.677003.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T11_22_56.677003", "path": ["**/details_harness|gsm8k|5_2023-10-15T11-22-56.677003.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T11-22-56.677003.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:28:09.665576.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:28:09.665576.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T11_22_56.677003", "path": ["**/details_harness|winogrande|5_2023-10-15T11-22-56.677003.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T11-22-56.677003.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T10_28_09.665576", "path": ["results_2023-07-24T10:28:09.665576.parquet"]}, {"split": "2023_10_15T11_22_56.677003", "path": ["results_2023-10-15T11-22-56.677003.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T11-22-56.677003.parquet"]}]}]}
2023-10-15T10:23:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openlm-research/open_llama_3b_v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openlm-research/open_llama_3b_v2 on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T11:22:56.677003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of openlm-research/open_llama_3b_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_3b_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T11:22:56.677003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openlm-research/open_llama_3b_v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_3b_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T11:22:56.677003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openlm-research/open_llama_3b_v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_3b_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T11:22:56.677003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d504c169cbd950c856031555558c90fce6fde8d1
# Dataset Card for Evaluation run of openlm-research/open_llama_13b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/openlm-research/open_llama_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openlm-research/open_llama_13b](https://huggingface.co/openlm-research/open_llama_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T17:56:49.237621](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_13b/blob/main/results_2023-10-15T17-56-49.237621.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.00036305608931192504,
        "f1": 0.059914010067114144,
        "f1_stderr": 0.0014216209947324643,
        "acc": 0.3766001485184359,
        "acc_stderr": 0.008751247780672124
    },
    "harness|drop|3": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.00036305608931192504,
        "f1": 0.059914010067114144,
        "f1_stderr": 0.0014216209947324643
    },
    "harness|gsm8k|5": {
        "acc": 0.032600454890068235,
        "acc_stderr": 0.004891669021939565
    },
    "harness|winogrande|5": {
        "acc": 0.7205998421468035,
        "acc_stderr": 0.012610826539404684
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_openlm-research__open_llama_13b
[ "region:us" ]
2023-08-18T10:44:48+00:00
{"pretty_name": "Evaluation run of openlm-research/open_llama_13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [openlm-research/open_llama_13b](https://huggingface.co/openlm-research/open_llama_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openlm-research__open_llama_13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T17:56:49.237621](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_13b/blob/main/results_2023-10-15T17-56-49.237621.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931192504,\n \"f1\": 0.059914010067114144,\n \"f1_stderr\": 0.0014216209947324643,\n \"acc\": 0.3766001485184359,\n \"acc_stderr\": 0.008751247780672124\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931192504,\n \"f1\": 0.059914010067114144,\n \"f1_stderr\": 0.0014216209947324643\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.032600454890068235,\n \"acc_stderr\": 0.004891669021939565\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404684\n }\n}\n```", "repo_url": "https://huggingface.co/openlm-research/open_llama_13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T10_40_56.309063", "path": ["**/details_harness|drop|3_2023-10-13T10-40-56.309063.parquet"]}, {"split": "2023_10_15T17_56_49.237621", "path": ["**/details_harness|drop|3_2023-10-15T17-56-49.237621.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T17-56-49.237621.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T10_40_56.309063", "path": ["**/details_harness|gsm8k|5_2023-10-13T10-40-56.309063.parquet"]}, {"split": "2023_10_15T17_56_49.237621", "path": ["**/details_harness|gsm8k|5_2023-10-15T17-56-49.237621.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-15T17-56-49.237621.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:02:18.035133.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:02:18.035133.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:23:47.053275.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:23:47.053275.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:23:47.053275.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:23:47.053275.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": 
"2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": 
"2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:23:47.053275.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:23:47.053275.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T10_40_56.309063", "path": ["**/details_harness|winogrande|5_2023-10-13T10-40-56.309063.parquet"]}, {"split": "2023_10_15T17_56_49.237621", "path": ["**/details_harness|winogrande|5_2023-10-15T17-56-49.237621.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T17-56-49.237621.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:28:46.796223.parquet", 
"**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:28:46.796223.parquet", 
"**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:28:46.796223.parquet", 
"**/details_original|mmlu:management|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:28:46.796223.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": 
["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": 
["**/details_original|mmlu:global_facts|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:management|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", 
"data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_28_46.796223", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:28:46.796223.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:28:46.796223.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T10_02_18.035133", "path": ["results_2023-07-20T10:02:18.035133.parquet"]}, {"split": "2023_07_24T14_23_47.053275", "path": ["results_2023-07-24T14:23:47.053275.parquet"]}, {"split": "2023_08_28T20_28_46.796223", "path": ["results_2023-08-28T20:28:46.796223.parquet"]}, {"split": "2023_10_13T10_40_56.309063", "path": ["results_2023-10-13T10-40-56.309063.parquet"]}, {"split": "2023_10_15T17_56_49.237621", "path": ["results_2023-10-15T17-56-49.237621.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T17-56-49.237621.parquet"]}]}]}
2023-10-15T16:56:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openlm-research/open_llama_13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model openlm-research/open_llama_13b on the Open LLM Leaderboard. The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T17:56:49.237621 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
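The summary above says each per-task configuration can be loaded as its own split but does not carry the code itself. A minimal sketch of that load call, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming pattern (the exact repository id is not spelled out in this record) and using the `harness_winogrande_5` config that appears in the metadata above:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's "details_<org>__<model>" pattern.
data = load_dataset(
    "open-llm-leaderboard/details_openlm-research__open_llama_13b",
    "harness_winogrande_5",   # one of the per-task configs listed in the metadata above
    split="latest",           # the metadata maps "latest" to the most recent run's parquet file
)
# Per the summary, the "train" split also points at the latest results.
print(data[0])
```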
[ "# Dataset Card for Evaluation run of openlm-research/open_llama_13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T17:56:49.237621(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openlm-research/open_llama_13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T17:56:49.237621(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openlm-research/open_llama_13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T17:56:49.237621(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e9794223cf0bb9b5a37ab0de808db6ed41125226
# Dataset Card for Evaluation run of openlm-research/open_llama_7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/openlm-research/open_llama_7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [openlm-research/open_llama_7b](https://huggingface.co/openlm-research/open_llama_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T17:26:48.856271](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_7b/blob/main/results_2023-10-18T17-26-48.856271.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.00029649629898012564,
        "f1": 0.054966442953020285,
        "f1_stderr": 0.00134099148142866,
        "acc": 0.3477395817189483,
        "acc_stderr": 0.008281452365035358
    },
    "harness|drop|3": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.00029649629898012564,
        "f1": 0.054966442953020285,
        "f1_stderr": 0.00134099148142866
    },
    "harness|gsm8k|5": {
        "acc": 0.01592115238817286,
        "acc_stderr": 0.0034478192723890037
    },
    "harness|winogrande|5": {
        "acc": 0.6795580110497238,
        "acc_stderr": 0.013115085457681712
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
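The card above also describes a "results" configuration that holds the aggregated run-level scores. A hedged sketch of loading it, assuming this repository exposes the same "latest" split for that config as the other detail repositories in this collection do:

```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration described in the summary above;
# the "latest" split name mirrors the per-task configs and is assumed here.
results = load_dataset(
    "open-llm-leaderboard/details_openlm-research__open_llama_7b",
    "results",
    split="latest",
)
print(results[0])  # the aggregated scores recorded for the most recent run
```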
open-llm-leaderboard/details_openlm-research__open_llama_7b
[ "region:us" ]
2023-08-18T10:45:06+00:00
{"pretty_name": "Evaluation run of openlm-research/open_llama_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [openlm-research/open_llama_7b](https://huggingface.co/openlm-research/open_llama_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openlm-research__open_llama_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T17:26:48.856271](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_7b/blob/main/results_2023-10-18T17-26-48.856271.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012564,\n \"f1\": 0.054966442953020285,\n \"f1_stderr\": 0.00134099148142866,\n \"acc\": 0.3477395817189483,\n \"acc_stderr\": 0.008281452365035358\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012564,\n \"f1\": 0.054966442953020285,\n \"f1_stderr\": 0.00134099148142866\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \"acc_stderr\": 0.0034478192723890037\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6795580110497238,\n \"acc_stderr\": 0.013115085457681712\n }\n}\n```", "repo_url": "https://huggingface.co/openlm-research/open_llama_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T17_12_53.113186", "path": ["**/details_harness|drop|3_2023-10-16T17-12-53.113186.parquet"]}, {"split": "2023_10_18T17_26_48.856271", "path": ["**/details_harness|drop|3_2023-10-18T17-26-48.856271.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T17-26-48.856271.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T17_12_53.113186", "path": ["**/details_harness|gsm8k|5_2023-10-16T17-12-53.113186.parquet"]}, {"split": "2023_10_18T17_26_48.856271", "path": ["**/details_harness|gsm8k|5_2023-10-18T17-26-48.856271.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-18T17-26-48.856271.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:27:20.581564.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:27:20.581564.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": 
"2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": 
"2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:52:35.127282.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:52:35.127282.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T17_12_53.113186", "path": ["**/details_harness|winogrande|5_2023-10-16T17-12-53.113186.parquet"]}, {"split": "2023_10_18T17_26_48.856271", "path": ["**/details_harness|winogrande|5_2023-10-18T17-26-48.856271.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T17-26-48.856271.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T12_27_20.581564", "path": ["results_2023-07-18T12:27:20.581564.parquet"]}, {"split": "2023_07_19T10_52_35.127282", "path": ["results_2023-07-19T10:52:35.127282.parquet"]}, {"split": "2023_10_16T17_12_53.113186", "path": ["results_2023-10-16T17-12-53.113186.parquet"]}, {"split": "2023_10_18T17_26_48.856271", "path": ["results_2023-10-18T17-26-48.856271.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T17-26-48.856271.parquet"]}]}]}
2023-10-18T16:26:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of openlm-research/open_llama_7b

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model openlm-research/open_llama_7b on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance use the `datasets` library (a sketch is given at the end of this card).

## Latest results

These are the latest results from run 2023-10-18T17:26:48.856271 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
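Referenced from the Dataset Summary above, here is a minimal sketch of loading the per-task details of a run. The repository id follows the leaderboard's `details_<org>__<model>` convention and is an assumption here; the configuration name is taken from the metadata earlier in this entry.

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_openlm-research__open_llama_7b",  # assumed repo id
    "harness_winogrande_5",  # any config listed in the metadata above works here
    split="latest",          # or a timestamped split such as "2023_10_18T17_26_48.856271"
)
print(data)
```

Any other configuration listed in the metadata (for example one of the `harness_hendrycksTest_*_5` tasks) can be loaded the same way.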
[ "# Dataset Card for Evaluation run of openlm-research/open_llama_7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T17:26:48.856271(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of openlm-research/open_llama_7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T17:26:48.856271(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openlm-research/open_llama_7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openlm-research/open_llama_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T17:26:48.856271(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]