| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
dc2a7bd03f1acfde5b88c2b6becf705f808ce1bc
# Dataset Card for Evaluation run of MayaPH/FinOPT-Franklin

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/FinOPT-Franklin
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [MayaPH/FinOPT-Franklin](https://huggingface.co/MayaPH/FinOPT-Franklin) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__FinOPT-Franklin",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T03:49:57.107802](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Franklin/blob/main/results_2023-10-18T03-49-57.107802.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the task-specific results and in the "latest" split of each eval):

```python
{
    "all": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.00023443780464837331,
        "f1": 0.0010171979865771813,
        "f1_stderr": 0.0002699153689755448,
        "acc": 0.2525651144435675,
        "acc_stderr": 0.007025872980895256
    },
    "harness|drop|3": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.00023443780464837331,
        "f1": 0.0010171979865771813,
        "f1_stderr": 0.0002699153689755448
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.505130228887135,
        "acc_stderr": 0.014051745961790513
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
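The loading snippet in the card above pulls the per-example details of a single task. A complementary way to use the repository is to read the aggregated "results" configuration directly. The sketch below is a minimal example, assuming the `results` config and its `latest` split listed in the repository metadata; the exact column layout of the aggregated table is an assumption, so the prints are there to inspect it.

```python
from datasets import load_dataset

# Load the aggregated metrics instead of the per-example details.
# "results" and its "latest" split come from the configs listed in the
# dataset metadata; the shape of the resulting table is an assumption here.
results = load_dataset(
    "open-llm-leaderboard/details_MayaPH__FinOPT-Franklin",
    "results",
    split="latest",
)
print(results.column_names)  # which aggregated fields are available
print(results[0])            # first (typically only) row of aggregated results
```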
open-llm-leaderboard/details_MayaPH__FinOPT-Franklin
[ "region:us" ]
2023-08-17T22:55:42+00:00
{"pretty_name": "Evaluation run of MayaPH/FinOPT-Franklin", "dataset_summary": "Dataset automatically created during the evaluation run of model [MayaPH/FinOPT-Franklin](https://huggingface.co/MayaPH/FinOPT-Franklin) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__FinOPT-Franklin\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T03:49:57.107802](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Franklin/blob/main/results_2023-10-18T03-49-57.107802.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464837331,\n \"f1\": 0.0010171979865771813,\n \"f1_stderr\": 0.0002699153689755448,\n \"acc\": 0.2525651144435675,\n \"acc_stderr\": 0.007025872980895256\n },\n \"harness|drop|3\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464837331,\n \"f1\": 0.0010171979865771813,\n \"f1_stderr\": 0.0002699153689755448\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790513\n }\n}\n```", "repo_url": "https://huggingface.co/MayaPH/FinOPT-Franklin", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|arc:challenge|25_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T03_49_57.107802", "path": ["**/details_harness|drop|3_2023-10-18T03-49-57.107802.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T03-49-57.107802.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T03_49_57.107802", "path": ["**/details_harness|gsm8k|5_2023-10-18T03-49-57.107802.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T03-49-57.107802.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hellaswag|10_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T12:10:37.381661.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T12:10:37.381661.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T03_49_57.107802", "path": ["**/details_harness|winogrande|5_2023-10-18T03-49-57.107802.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T03-49-57.107802.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T12_10_37.381661", "path": ["results_2023-07-19T12:10:37.381661.parquet"]}, {"split": "2023_10_18T03_49_57.107802", "path": ["results_2023-10-18T03-49-57.107802.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T03-49-57.107802.parquet"]}]}]}
2023-10-18T02:50:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of MayaPH/FinOPT-Franklin ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model MayaPH/FinOPT-Franklin on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T03:49:57.107802(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of MayaPH/FinOPT-Franklin", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Franklin on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T03:49:57.107802(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of MayaPH/FinOPT-Franklin", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Franklin on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T03:49:57.107802(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MayaPH/FinOPT-Franklin## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Franklin on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T03:49:57.107802(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ba642830e73387d92dab05f6636a484c3535c0a4
# Dataset Card for Evaluation run of MayaPH/FinOPT-Lincoln

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/FinOPT-Lincoln
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [MayaPH/FinOPT-Lincoln](https://huggingface.co/MayaPH/FinOPT-Lincoln) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__FinOPT-Lincoln",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T01:56:21.119059](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Lincoln/blob/main/results_2023-10-18T01-56-21.119059.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the task-specific results and in the "latest" split of each eval):

```python
{
    "all": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.0003314581465219158,
        "f1": 0.007617449664429529,
        "f1_stderr": 0.0006036457063633518,
        "acc": 0.24861878453038674,
        "acc_stderr": 0.007026135605808221
    },
    "harness|drop|3": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.0003314581465219158,
        "f1": 0.007617449664429529,
        "f1_stderr": 0.0006036457063633518
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.4972375690607735,
        "acc_stderr": 0.014052271211616441
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
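Because each run is also kept as a timestamp-named split, it can be useful to load one specific run rather than the `latest` alias. A minimal sketch, assuming the timestamped split names listed in the repository metadata (for example the 2023-10-18 run of the GSM8K config); any other config/timestamp pair listed there should work the same way.

```python
from datasets import load_dataset

# Load one specific evaluation run by its timestamped split name.
# The config and split names below are taken from the dataset metadata.
run_details = load_dataset(
    "open-llm-leaderboard/details_MayaPH__FinOPT-Lincoln",
    "harness_gsm8k_5",
    split="2023_10_18T01_56_21.119059",
)
print(len(run_details), run_details.column_names)
```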
open-llm-leaderboard/details_MayaPH__FinOPT-Lincoln
[ "region:us" ]
2023-08-17T22:55:51+00:00
{"pretty_name": "Evaluation run of MayaPH/FinOPT-Lincoln", "dataset_summary": "Dataset automatically created during the evaluation run of model [MayaPH/FinOPT-Lincoln](https://huggingface.co/MayaPH/FinOPT-Lincoln) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__FinOPT-Lincoln\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T01:56:21.119059](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Lincoln/blob/main/results_2023-10-18T01-56-21.119059.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219158,\n \"f1\": 0.007617449664429529,\n \"f1_stderr\": 0.0006036457063633518,\n \"acc\": 0.24861878453038674,\n \"acc_stderr\": 0.007026135605808221\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219158,\n \"f1\": 0.007617449664429529,\n \"f1_stderr\": 0.0006036457063633518\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616441\n }\n}\n```", "repo_url": "https://huggingface.co/MayaPH/FinOPT-Lincoln", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T01_56_21.119059", "path": ["**/details_harness|drop|3_2023-10-18T01-56-21.119059.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T01-56-21.119059.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T01_56_21.119059", "path": ["**/details_harness|gsm8k|5_2023-10-18T01-56-21.119059.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T01-56-21.119059.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hellaswag|10_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:38:32.628939.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:38:32.628939.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:38:32.628939.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:38:32.628939.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:38:32.628939.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:38:32.628939.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T01_56_21.119059", "path": ["**/details_harness|winogrande|5_2023-10-18T01-56-21.119059.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T01-56-21.119059.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T11_38_32.628939", "path": ["results_2023-07-19T11:38:32.628939.parquet"]}, {"split": "2023_10_18T01_56_21.119059", "path": ["results_2023-10-18T01-56-21.119059.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T01-56-21.119059.parquet"]}]}]}
2023-10-18T00:56:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of MayaPH/FinOPT-Lincoln

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model MayaPH/FinOPT-Lincoln on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the loading sketch just after this card):

## Latest results

These are the latest results from run 2023-10-18T01:56:21.119059 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
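A minimal loading sketch for the card above. The repository id is an assumption, following the `open-llm-leaderboard/details_<org>__<model>` convention used by the Washington record later in this dump.

```python
from datasets import load_dataset

# Repository id is an assumption, following the details_<org>__<model>
# convention used by the Washington record in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_MayaPH__FinOPT-Lincoln",
    "harness_winogrande_5",
    split="train",
)
print(data)
```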
[ "# Dataset Card for Evaluation run of MayaPH/FinOPT-Lincoln", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Lincoln on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T01:56:21.119059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of MayaPH/FinOPT-Lincoln", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Lincoln on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T01:56:21.119059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MayaPH/FinOPT-Lincoln## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Lincoln on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T01:56:21.119059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5a4564ab57b5a8c956da799d3229d9360e967394
# Dataset Card for Evaluation run of MayaPH/FinOPT-Washington

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/FinOPT-Washington
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [MayaPH/FinOPT-Washington](https://huggingface.co/MayaPH/FinOPT-Washington) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__FinOPT-Washington",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T23:56:22.657574](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Washington/blob/main/results_2023-10-18T23-56-22.657574.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.0004320097346039032,
        "f1": 0.010031459731543634,
        "f1_stderr": 0.0006658804659596376,
        "acc": 0.255327545382794,
        "acc_stderr": 0.0070246472681452
    },
    "harness|drop|3": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.0004320097346039032,
        "f1": 0.010031459731543634,
        "f1_stderr": 0.0006658804659596376
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.510655090765588,
        "acc_stderr": 0.0140492945362904
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
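Beyond the per-task loading snippet shown in the card, the aggregated scores live in the "results" configuration it describes. A minimal sketch follows, assuming that configuration also exposes a "latest" split, as the split layout declared for the other records in this dump suggests.

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split name is an
# assumption based on the split layout declared for the other records.
results = load_dataset(
    "open-llm-leaderboard/details_MayaPH__FinOPT-Washington",
    "results",
    split="latest",
)
print(results)
```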
open-llm-leaderboard/details_MayaPH__FinOPT-Washington
[ "region:us" ]
2023-08-17T22:56:00+00:00
{"pretty_name": "Evaluation run of MayaPH/FinOPT-Washington", "dataset_summary": "Dataset automatically created during the evaluation run of model [MayaPH/FinOPT-Washington](https://huggingface.co/MayaPH/FinOPT-Washington) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__FinOPT-Washington\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T23:56:22.657574](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Washington/blob/main/results_2023-10-18T23-56-22.657574.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346039032,\n \"f1\": 0.010031459731543634,\n \"f1_stderr\": 0.0006658804659596376,\n \"acc\": 0.255327545382794,\n \"acc_stderr\": 0.0070246472681452\n },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346039032,\n \"f1\": 0.010031459731543634,\n \"f1_stderr\": 0.0006658804659596376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.0140492945362904\n }\n}\n```", "repo_url": "https://huggingface.co/MayaPH/FinOPT-Washington", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T23_56_22.657574", "path": ["**/details_harness|drop|3_2023-10-18T23-56-22.657574.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T23-56-22.657574.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T23_56_22.657574", "path": ["**/details_harness|gsm8k|5_2023-10-18T23-56-22.657574.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T23-56-22.657574.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:02.567190.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:02.567190.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:02.567190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:13:02.567190.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:13:02.567190.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:13:02.567190.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T23_56_22.657574", "path": ["**/details_harness|winogrande|5_2023-10-18T23-56-22.657574.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T23-56-22.657574.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_13_02.567190", "path": ["results_2023-07-19T19:13:02.567190.parquet"]}, {"split": "2023_10_18T23_56_22.657574", "path": ["results_2023-10-18T23-56-22.657574.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T23-56-22.657574.parquet"]}]}]}
2023-10-18T22:56:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of MayaPH/FinOPT-Washington ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model MayaPH/FinOPT-Washington on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T23:56:22.657574 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
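The loading snippet that this processed card refers to ("you can for instance do the following") was stripped during flattening. A minimal sketch of what it would look like, assuming the repository follows the leaderboard's `open-llm-leaderboard/details_<org>__<model>` naming pattern and using the `harness_winogrande_5` configuration listed in the metadata above:

```python
from datasets import load_dataset

# Repository id is an assumption, inferred from the leaderboard's naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_MayaPH__FinOPT-Washington",
    "harness_winogrande_5",
    split="latest",  # or the timestamped split "2023_10_18T23_56_22.657574"
)
print(data)
```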
[ "# Dataset Card for Evaluation run of MayaPH/FinOPT-Washington", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Washington on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T23:56:22.657574(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of MayaPH/FinOPT-Washington", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Washington on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T23:56:22.657574(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MayaPH/FinOPT-Washington## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/FinOPT-Washington on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T23:56:22.657574(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e1f1b2ce1718ed90faf8b5d700b0abb9d04a0930
# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/MayaPH/GodziLLa-30B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [MayaPH/GodziLLa-30B](https://huggingface.co/MayaPH/GodziLLa-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_MayaPH__GodziLLa-30B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-17T01:20:37.554639](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa-30B/blob/main/results_2023-09-17T01-20-37.554639.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.22808305369127516, "em_stderr": 0.004297060303049989, "f1": 0.34862416107382826, "f1_stderr": 0.004249472334452047, "acc": 0.3827162119062479, "acc_stderr": 0.006833824703926247 }, "harness|drop|3": { "em": 0.22808305369127516, "em_stderr": 0.004297060303049989, "f1": 0.34862416107382826, "f1_stderr": 0.004249472334452047 }, "harness|gsm8k|5": { "acc": 0.0037907505686125853, "acc_stderr": 0.0016927007401501802 }, "harness|winogrande|5": { "acc": 0.7616416732438832, "acc_stderr": 0.011974948667702313 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
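Complementing the snippet in the card above, the aggregated "results" configuration it mentions can be loaded the same way; a minimal sketch, reusing the repository id shown in the card:

```python
from datasets import load_dataset

# "latest" always points at the most recent run; timestamped splits
# (e.g. "2023_09_17T01_20_37.554639") keep earlier runs addressable.
results = load_dataset(
    "open-llm-leaderboard/details_MayaPH__GodziLLa-30B",
    "results",
    split="latest",
)
print(results[0])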
open-llm-leaderboard/details_MayaPH__GodziLLa-30B
[ "region:us" ]
2023-08-17T22:56:09+00:00
{"pretty_name": "Evaluation run of MayaPH/GodziLLa-30B", "dataset_summary": "Dataset automatically created during the evaluation run of model [MayaPH/GodziLLa-30B](https://huggingface.co/MayaPH/GodziLLa-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__GodziLLa-30B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T01:20:37.554639](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa-30B/blob/main/results_2023-09-17T01-20-37.554639.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22808305369127516,\n \"em_stderr\": 0.004297060303049989,\n \"f1\": 0.34862416107382826,\n \"f1_stderr\": 0.004249472334452047,\n \"acc\": 0.3827162119062479,\n \"acc_stderr\": 0.006833824703926247\n },\n \"harness|drop|3\": {\n \"em\": 0.22808305369127516,\n \"em_stderr\": 0.004297060303049989,\n \"f1\": 0.34862416107382826,\n \"f1_stderr\": 0.004249472334452047\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501802\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702313\n }\n}\n```", "repo_url": "https://huggingface.co/MayaPH/GodziLLa-30B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T01_20_37.554639", "path": ["**/details_harness|drop|3_2023-09-17T01-20-37.554639.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T01-20-37.554639.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T01_20_37.554639", "path": ["**/details_harness|gsm8k|5_2023-09-17T01-20-37.554639.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T01-20-37.554639.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:21:46.977528.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:21:46.977528.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T01_20_37.554639", "path": ["**/details_harness|winogrande|5_2023-09-17T01-20-37.554639.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T01-20-37.554639.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_21_46.977528", "path": ["results_2023-07-19T22:21:46.977528.parquet"]}, {"split": "2023_09_17T01_20_37.554639", "path": ["results_2023-09-17T01-20-37.554639.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T01-20-37.554639.parquet"]}]}]}
2023-09-17T00:20:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T01:20:37.554639 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
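As with the other processed card, the loading example referenced here was stripped; a short sketch restoring it, with the repository id taken from the full card for this model:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_MayaPH__GodziLLa-30B",
    "harness_winogrande_5",
    split="latest",  # or the timestamped split "2023_09_17T01_20_37.554639"
)
print(data)
```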
[ "# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T01:20:37.554639(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T01:20:37.554639(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T01:20:37.554639(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0f92a0e08e5eba33f1d3aa1e6c78782c28d7b882
# Dataset Card for Evaluation run of liuxiang886/llama2-70B-qlora-gpt4

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [liuxiang886/llama2-70B-qlora-gpt4](https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T18:00:05.987903](https://huggingface.co/datasets/open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4/blob/main/results_2023-09-17T18-00-05.987903.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.4848993288590604,
        "em_stderr": 0.005118132215061967,
        "f1": 0.5715404781879219,
        "f1_stderr": 0.004685062097512246,
        "acc": 0.5587922375481174,
        "acc_stderr": 0.011536318547544595
    },
    "harness|drop|3": {
        "em": 0.4848993288590604,
        "em_stderr": 0.005118132215061967,
        "f1": 0.5715404781879219,
        "f1_stderr": 0.004685062097512246
    },
    "harness|gsm8k|5": {
        "acc": 0.288855193328279,
        "acc_stderr": 0.012484219800126664
    },
    "harness|winogrande|5": {
        "acc": 0.8287292817679558,
        "acc_stderr": 0.010588417294962526
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
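The card above demonstrates loading a single per-task configuration; the aggregated "results" configuration it mentions can be loaded the same way. A minimal sketch, using the "latest" split name that appears in this record's configuration metadata below:

```python
from datasets import load_dataset

# Aggregated metrics for liuxiang886/llama2-70B-qlora-gpt4; the "latest" split
# points at the most recent evaluation run listed in the metadata.
results = load_dataset(
    "open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated results
```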
open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4
[ "region:us" ]
2023-08-17T22:56:18+00:00
{"pretty_name": "Evaluation run of liuxiang886/llama2-70B-qlora-gpt4", "dataset_summary": "Dataset automatically created during the evaluation run of model [liuxiang886/llama2-70B-qlora-gpt4](https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T18:00:05.987903](https://huggingface.co/datasets/open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4/blob/main/results_2023-09-17T18-00-05.987903.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4848993288590604,\n \"em_stderr\": 0.005118132215061967,\n \"f1\": 0.5715404781879219,\n \"f1_stderr\": 0.004685062097512246,\n \"acc\": 0.5587922375481174,\n \"acc_stderr\": 0.011536318547544595\n },\n \"harness|drop|3\": {\n \"em\": 0.4848993288590604,\n \"em_stderr\": 0.005118132215061967,\n \"f1\": 0.5715404781879219,\n \"f1_stderr\": 0.004685062097512246\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.288855193328279,\n \"acc_stderr\": 0.012484219800126664\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962526\n }\n}\n```", "repo_url": "https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T18_00_05.987903", "path": ["**/details_harness|drop|3_2023-09-17T18-00-05.987903.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T18-00-05.987903.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T18_00_05.987903", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-00-05.987903.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-00-05.987903.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:45:03.475580.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:45:03.475580.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T18_00_05.987903", "path": ["**/details_harness|winogrande|5_2023-09-17T18-00-05.987903.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T18-00-05.987903.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T20_45_03.475580", "path": ["results_2023-08-09T20:45:03.475580.parquet"]}, {"split": "2023_09_17T18_00_05.987903", "path": ["results_2023-09-17T18-00-05.987903.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T18-00-05.987903.parquet"]}]}]}
2023-09-17T17:00:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of liuxiang886/llama2-70B-qlora-gpt4 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model liuxiang886/llama2-70B-qlora-gpt4 on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T18:00:05.987903(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
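This flattened rendering of the card omits the loading example that follows "you can for instance do the following:" in the full card above. For completeness, a minimal sketch using the dataset id recorded for this entry:

```python
from datasets import load_dataset

# Dataset id taken from this record; "harness_winogrande_5" is one of the
# per-task configurations listed in its metadata.
data = load_dataset(
    "open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4",
    "harness_winogrande_5",
    split="train",
)
```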
[ "# Dataset Card for Evaluation run of liuxiang886/llama2-70B-qlora-gpt4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model liuxiang886/llama2-70B-qlora-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T18:00:05.987903(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of liuxiang886/llama2-70B-qlora-gpt4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model liuxiang886/llama2-70B-qlora-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T18:00:05.987903(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of liuxiang886/llama2-70B-qlora-gpt4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model liuxiang886/llama2-70B-qlora-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T18:00:05.987903(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bad006decef1e96035a38ef381fe33ea8deae60d
# Dataset Card for Evaluation run of notstoic/PygmalionCoT-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/notstoic/PygmalionCoT-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [notstoic/PygmalionCoT-7b](https://huggingface.co/notstoic/PygmalionCoT-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_notstoic__PygmalionCoT-7b",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T15:06:38.792335](https://huggingface.co/datasets/open-llm-leaderboard/details_notstoic__PygmalionCoT-7b/blob/main/results_2023-09-22T15-06-38.792335.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.12111996644295302,
        "em_stderr": 0.0033412757702121106,
        "f1": 0.17514471476510068,
        "f1_stderr": 0.0034689450739406216,
        "acc": 0.36081482886571287,
        "acc_stderr": 0.00895060187911282
    },
    "harness|drop|3": {
        "em": 0.12111996644295302,
        "em_stderr": 0.0033412757702121106,
        "f1": 0.17514471476510068,
        "f1_stderr": 0.0034689450739406216
    },
    "harness|gsm8k|5": {
        "acc": 0.032600454890068235,
        "acc_stderr": 0.004891669021939579
    },
    "harness|winogrande|5": {
        "acc": 0.6890292028413575,
        "acc_stderr": 0.01300953473628606
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
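As with the previous record, the per-task loading example above extends naturally to the aggregated "results" configuration mentioned in the summary. A minimal sketch; the "latest" split name follows the convention visible in this record's configuration metadata below:

```python
from datasets import load_dataset

# Aggregated metrics for notstoic/PygmalionCoT-7b; "latest" points at the most
# recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_notstoic__PygmalionCoT-7b",
    "results",
    split="latest",
)
print(results[0])
```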
open-llm-leaderboard/details_notstoic__PygmalionCoT-7b
[ "region:us" ]
2023-08-17T22:56:26+00:00
{"pretty_name": "Evaluation run of notstoic/PygmalionCoT-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [notstoic/PygmalionCoT-7b](https://huggingface.co/notstoic/PygmalionCoT-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_notstoic__PygmalionCoT-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T15:06:38.792335](https://huggingface.co/datasets/open-llm-leaderboard/details_notstoic__PygmalionCoT-7b/blob/main/results_2023-09-22T15-06-38.792335.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12111996644295302,\n \"em_stderr\": 0.0033412757702121106,\n \"f1\": 0.17514471476510068,\n \"f1_stderr\": 0.0034689450739406216,\n \"acc\": 0.36081482886571287,\n \"acc_stderr\": 0.00895060187911282\n },\n \"harness|drop|3\": {\n \"em\": 0.12111996644295302,\n \"em_stderr\": 0.0033412757702121106,\n \"f1\": 0.17514471476510068,\n \"f1_stderr\": 0.0034689450739406216\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.032600454890068235,\n \"acc_stderr\": 0.004891669021939579\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6890292028413575,\n \"acc_stderr\": 0.01300953473628606\n }\n}\n```", "repo_url": "https://huggingface.co/notstoic/PygmalionCoT-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T15_06_38.792335", "path": ["**/details_harness|drop|3_2023-09-22T15-06-38.792335.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T15-06-38.792335.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T15_06_38.792335", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-06-38.792335.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-06-38.792335.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:24:33.017908.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:24:33.017908.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T15_06_38.792335", "path": ["**/details_harness|winogrande|5_2023-09-22T15-06-38.792335.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T15-06-38.792335.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T12_24_33.017908", "path": ["results_2023-07-18T12:24:33.017908.parquet"]}, {"split": "2023_09_22T15_06_38.792335", "path": ["results_2023-09-22T15-06-38.792335.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T15-06-38.792335.parquet"]}]}]}
2023-09-22T14:06:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of notstoic/PygmalionCoT-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model notstoic/PygmalionCoT-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T15:06:38.792335 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of notstoic/PygmalionCoT-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model notstoic/PygmalionCoT-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:06:38.792335(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of notstoic/PygmalionCoT-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model notstoic/PygmalionCoT-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:06:38.792335(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of notstoic/PygmalionCoT-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model notstoic/PygmalionCoT-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T15:06:38.792335(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7ab4db06ba647f0f76cce8f7d270d72cfe029b03
# Dataset Card for Evaluation run of illuin/test-custom-llama

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/illuin/test-custom-llama
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [illuin/test-custom-llama](https://huggingface.co/illuin/test-custom-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_illuin__test-custom-llama",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T13:17:00.562267](https://huggingface.co/datasets/open-llm-leaderboard/details_illuin__test-custom-llama/blob/main/results_2023-10-27T13-17-00.562267.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.004089765100671141,
        "em_stderr": 0.0006535802669912844,
        "f1": 0.06308619966442945,
        "f1_stderr": 0.0014549394005291911,
        "acc": 0.38039089908704843,
        "acc_stderr": 0.009010133138187597
    },
    "harness|drop|3": {
        "em": 0.004089765100671141,
        "em_stderr": 0.0006535802669912844,
        "f1": 0.06308619966442945,
        "f1_stderr": 0.0014549394005291911
    },
    "harness|gsm8k|5": {
        "acc": 0.0401819560272934,
        "acc_stderr": 0.005409439736970511
    },
    "harness|winogrande|5": {
        "acc": 0.7205998421468035,
        "acc_stderr": 0.012610826539404684
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
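Besides the per-task configs shown above, the aggregated numbers from the "Latest results" section can be pulled programmatically from the "results" configuration. A minimal sketch follows, assuming the standard `datasets` API; the config name and the "latest" split come from this card's metadata, while the exact row layout of the results table is an assumption:

```python
from datasets import load_dataset

# Aggregated metrics across all evaluated tasks; the "latest" split always
# points at the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_illuin__test-custom-llama",
    "results",
    split="latest",
)

# Inspect the first row to see the stored metric fields (accuracy, exact match,
# F1, and their standard errors) — the precise column names may differ.
print(results[0])
```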
open-llm-leaderboard/details_illuin__test-custom-llama
[ "region:us" ]
2023-08-17T22:56:35+00:00
{"pretty_name": "Evaluation run of illuin/test-custom-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [illuin/test-custom-llama](https://huggingface.co/illuin/test-custom-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_illuin__test-custom-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T13:17:00.562267](https://huggingface.co/datasets/open-llm-leaderboard/details_illuin__test-custom-llama/blob/main/results_2023-10-27T13-17-00.562267.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004089765100671141,\n \"em_stderr\": 0.0006535802669912844,\n \"f1\": 0.06308619966442945,\n \"f1_stderr\": 0.0014549394005291911,\n \"acc\": 0.38039089908704843,\n \"acc_stderr\": 0.009010133138187597\n },\n \"harness|drop|3\": {\n \"em\": 0.004089765100671141,\n \"em_stderr\": 0.0006535802669912844,\n \"f1\": 0.06308619966442945,\n \"f1_stderr\": 0.0014549394005291911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \"acc_stderr\": 0.005409439736970511\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404684\n }\n}\n```", "repo_url": "https://huggingface.co/illuin/test-custom-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T13_17_00.562267", "path": ["**/details_harness|drop|3_2023-10-27T13-17-00.562267.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T13-17-00.562267.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T13_17_00.562267", "path": ["**/details_harness|gsm8k|5_2023-10-27T13-17-00.562267.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T13-17-00.562267.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hellaswag|10_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:12:39.825467.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:12:39.825467.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T20:12:39.825467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:12:39.825467.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:12:39.825467.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T13_17_00.562267", "path": ["**/details_harness|winogrande|5_2023-10-27T13-17-00.562267.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T13-17-00.562267.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T20_12_39.825467", "path": ["results_2023-07-19T20:12:39.825467.parquet"]}, {"split": "2023_10_27T13_17_00.562267", "path": ["results_2023-10-27T13-17-00.562267.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T13-17-00.562267.parquet"]}]}]}
2023-10-27T12:17:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of illuin/test-custom-llama ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model illuin/test-custom-llama on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-27T13:17:00.562267 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
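The flattened text above refers to a load example that was stripped out when the card was linearized. A minimal sketch of what that call would look like is given below; the repository id is an assumption inferred from the `details_{org}__{model}` naming convention used by the other records in this dump, and the config and split names are taken from the configs metadata above.

```python
from datasets import load_dataset

# Assumed repository id (inferred from the naming convention, not stated on this line);
# "harness_winogrande_5" and the "latest" split are listed in the configs metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_illuin__test-custom-llama",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```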
[ "# Dataset Card for Evaluation run of illuin/test-custom-llama", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model illuin/test-custom-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T13:17:00.562267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of illuin/test-custom-llama", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model illuin/test-custom-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T13:17:00.562267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of illuin/test-custom-llama## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model illuin/test-custom-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T13:17:00.562267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7c2b3fbaa42c2fd06215926fa49ee745a4db7eab
# Dataset Card for Evaluation run of Lazycuber/pyg-instruct-wizardlm

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Lazycuber/pyg-instruct-wizardlm
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Lazycuber/pyg-instruct-wizardlm](https://huggingface.co/Lazycuber/pyg-instruct-wizardlm) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__pyg-instruct-wizardlm",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-28T08:03:29.005419](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__pyg-instruct-wizardlm/blob/main/results_2023-10-28T08-03-29.005419.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.01971476510067114,
        "em_stderr": 0.0014236777096831824,
        "f1": 0.07215394295302006,
        "f1_stderr": 0.001870662901719372,
        "acc": 0.3264294001877723,
        "acc_stderr": 0.008481505569434104
    },
    "harness|drop|3": {
        "em": 0.01971476510067114,
        "em_stderr": 0.0014236777096831824,
        "f1": 0.07215394295302006,
        "f1_stderr": 0.001870662901719372
    },
    "harness|gsm8k|5": {
        "acc": 0.01592115238817286,
        "acc_stderr": 0.0034478192723889907
    },
    "harness|winogrande|5": {
        "acc": 0.6369376479873717,
        "acc_stderr": 0.01351519186647922
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
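Beyond the per-task details shown in the card's own example, the "results" configuration mentioned in the Dataset Summary can be loaded the same way. The sketch below is illustrative only; the config and split names come from the configs metadata further down in this record.

```python
from datasets import load_dataset

# Aggregated metrics for every run of this model; the "latest" split points
# at the most recent evaluation (2023-10-28T08:03:29 here).
results = load_dataset(
    "open-llm-leaderboard/details_Lazycuber__pyg-instruct-wizardlm",
    "results",
    split="latest",
)
print(results[0])
```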
open-llm-leaderboard/details_Lazycuber__pyg-instruct-wizardlm
[ "region:us" ]
2023-08-17T22:56:44+00:00
{"pretty_name": "Evaluation run of Lazycuber/pyg-instruct-wizardlm", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lazycuber/pyg-instruct-wizardlm](https://huggingface.co/Lazycuber/pyg-instruct-wizardlm) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__pyg-instruct-wizardlm\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T08:03:29.005419](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__pyg-instruct-wizardlm/blob/main/results_2023-10-28T08-03-29.005419.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01971476510067114,\n \"em_stderr\": 0.0014236777096831824,\n \"f1\": 0.07215394295302006,\n \"f1_stderr\": 0.001870662901719372,\n \"acc\": 0.3264294001877723,\n \"acc_stderr\": 0.008481505569434104\n },\n \"harness|drop|3\": {\n \"em\": 0.01971476510067114,\n \"em_stderr\": 0.0014236777096831824,\n \"f1\": 0.07215394295302006,\n \"f1_stderr\": 0.001870662901719372\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \"acc_stderr\": 0.0034478192723889907\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6369376479873717,\n \"acc_stderr\": 0.01351519186647922\n }\n}\n```", "repo_url": "https://huggingface.co/Lazycuber/pyg-instruct-wizardlm", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T08_03_29.005419", "path": ["**/details_harness|drop|3_2023-10-28T08-03-29.005419.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T08-03-29.005419.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T08_03_29.005419", "path": ["**/details_harness|gsm8k|5_2023-10-28T08-03-29.005419.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T08-03-29.005419.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:30:39.317119.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:30:39.317119.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:30:39.317119.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:30:39.317119.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:30:39.317119.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T08_03_29.005419", "path": ["**/details_harness|winogrande|5_2023-10-28T08-03-29.005419.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T08-03-29.005419.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T15_30_39.317119", "path": ["results_2023-07-24T15:30:39.317119.parquet"]}, {"split": "2023_10_28T08_03_29.005419", "path": ["results_2023-10-28T08-03-29.005419.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T08-03-29.005419.parquet"]}]}]}
2023-10-28T07:03:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lazycuber/pyg-instruct-wizardlm ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Lazycuber/pyg-instruct-wizardlm on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-28T08:03:29.005419 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
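The load snippet referenced in the card text above is not reproduced in this processed field. A minimal sketch of such a load is given below; the repository id is an assumption based on the open-llm-leaderboard/details_<org>__<model> naming pattern used elsewhere in this dump, and "harness_winogrande_5" is one of the configurations listed in this run's metadata.

```python
# Minimal sketch, not the card's original snippet.
# The repository id is assumed from the details_<org>__<model> naming pattern;
# "harness_winogrande_5" is a configuration listed in this run's metadata.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Lazycuber__pyg-instruct-wizardlm",
    "harness_winogrande_5",
    split="train",
)
```

Each configuration also exposes timestamped splits and a "latest" split, as listed in the metadata above.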
[ "# Dataset Card for Evaluation run of Lazycuber/pyg-instruct-wizardlm", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lazycuber/pyg-instruct-wizardlm on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T08:03:29.005419(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lazycuber/pyg-instruct-wizardlm", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lazycuber/pyg-instruct-wizardlm on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T08:03:29.005419(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lazycuber/pyg-instruct-wizardlm## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lazycuber/pyg-instruct-wizardlm on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T08:03:29.005419(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
021578895d7a8e7ba37f9bf1976b2971d3442030
# Dataset Card for Evaluation run of Lazycuber/Janemalion-6B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Lazycuber/Janemalion-6B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Lazycuber/Janemalion-6B](https://huggingface.co/Lazycuber/Janemalion-6B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Lazycuber__Janemalion-6B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-24T11:00:29.262151](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__Janemalion-6B/blob/main/results_2023-07-24T11%3A00%3A29.262151.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.28835373902195927, "acc_stderr": 0.03262443339491777, "acc_norm": 0.29201970368726204, "acc_norm_stderr": 0.032622091393243714, "mc1": 0.20930232558139536, "mc1_stderr": 0.01424121943478583, "mc2": 0.34587723098212036, "mc2_stderr": 0.01348699014658101 }, "harness|arc:challenge|25": { "acc": 0.386518771331058, "acc_stderr": 0.01423008476191048, "acc_norm": 0.42406143344709896, "acc_norm_stderr": 0.014441889627464398 }, "harness|hellaswag|10": { "acc": 0.5052778331009758, "acc_stderr": 0.004989503417767287, "acc_norm": 0.6840270862378013, "acc_norm_stderr": 0.00463952045344403 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2962962962962963, "acc_stderr": 0.03944624162501116, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.03944624162501116 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3157894736842105, "acc_stderr": 0.03782728980865469, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.03782728980865469 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768077, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768077 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.30943396226415093, "acc_stderr": 0.028450154794118627, "acc_norm": 0.30943396226415093, "acc_norm_stderr": 0.028450154794118627 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.17, "acc_stderr": 0.0377525168068637, "acc_norm": 0.17, "acc_norm_stderr": 0.0377525168068637 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816507, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2658959537572254, "acc_stderr": 0.033687629322594316, "acc_norm": 0.2658959537572254, "acc_norm_stderr": 0.033687629322594316 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237655, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237655 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3446808510638298, "acc_stderr": 0.03106898596312215, "acc_norm": 0.3446808510638298, "acc_norm_stderr": 0.03106898596312215 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.044346007015849245, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.044346007015849245 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2482758620689655, "acc_stderr": 0.036001056927277716, "acc_norm": 0.2482758620689655, "acc_norm_stderr": 0.036001056927277716 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25132275132275134, "acc_stderr": 0.022340482339643898, "acc_norm": 0.25132275132275134, "acc_norm_stderr": 0.022340482339643898 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276864, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276864 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.25483870967741934, "acc_stderr": 0.024790118459332204, "acc_norm": 0.25483870967741934, "acc_norm_stderr": 0.024790118459332204 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2660098522167488, "acc_stderr": 0.031089826002937523, "acc_norm": 0.2660098522167488, "acc_norm_stderr": 0.031089826002937523 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2727272727272727, "acc_stderr": 0.03477691162163659, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.03477691162163659 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.21717171717171718, "acc_stderr": 0.029376616484945633, "acc_norm": 0.21717171717171718, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.2538860103626943, "acc_stderr": 0.03141024780565319, "acc_norm": 0.2538860103626943, "acc_norm_stderr": 0.03141024780565319 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.28974358974358977, "acc_stderr": 0.023000628243687968, "acc_norm": 0.28974358974358977, "acc_norm_stderr": 0.023000628243687968 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.026962424325073838, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.026962424325073838 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.23949579831932774, "acc_stderr": 0.02772206549336128, 
"acc_norm": 0.23949579831932774, "acc_norm_stderr": 0.02772206549336128 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2582781456953642, "acc_stderr": 0.035737053147634576, "acc_norm": 0.2582781456953642, "acc_norm_stderr": 0.035737053147634576 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.26055045871559634, "acc_stderr": 0.018819182034850068, "acc_norm": 0.26055045871559634, "acc_norm_stderr": 0.018819182034850068 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.18055555555555555, "acc_stderr": 0.026232878971491656, "acc_norm": 0.18055555555555555, "acc_norm_stderr": 0.026232878971491656 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.29411764705882354, "acc_stderr": 0.031980016601150726, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.031980016601150726 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3037974683544304, "acc_stderr": 0.029936696387138594, "acc_norm": 0.3037974683544304, "acc_norm_stderr": 0.029936696387138594 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.34080717488789236, "acc_stderr": 0.0318114974705536, "acc_norm": 0.34080717488789236, "acc_norm_stderr": 0.0318114974705536 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.25190839694656486, "acc_stderr": 0.03807387116306086, "acc_norm": 0.25190839694656486, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.36363636363636365, "acc_stderr": 0.04391326286724071, "acc_norm": 0.36363636363636365, "acc_norm_stderr": 0.04391326286724071 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.32407407407407407, "acc_stderr": 0.04524596007030049, "acc_norm": 0.32407407407407407, "acc_norm_stderr": 0.04524596007030049 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25153374233128833, "acc_stderr": 0.034089978868575295, "acc_norm": 0.25153374233128833, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.27350427350427353, "acc_stderr": 0.029202540153431177, "acc_norm": 0.27350427350427353, "acc_norm_stderr": 0.029202540153431177 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.3243933588761175, "acc_stderr": 0.016740929047162706, "acc_norm": 0.3243933588761175, "acc_norm_stderr": 0.016740929047162706 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2947976878612717, "acc_stderr": 0.02454761779480383, "acc_norm": 0.2947976878612717, "acc_norm_stderr": 0.02454761779480383 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961459, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961459 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3431372549019608, "acc_stderr": 0.02718449890994162, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.02718449890994162 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.29260450160771706, "acc_stderr": 0.02583989833487798, "acc_norm": 0.29260450160771706, "acc_norm_stderr": 0.02583989833487798 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.3148148148148148, "acc_stderr": 0.025842248700902168, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.025842248700902168 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2978723404255319, "acc_stderr": 0.027281608344469414, "acc_norm": 0.2978723404255319, "acc_norm_stderr": 0.027281608344469414 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.31681877444589307, "acc_stderr": 0.01188234995472301, "acc_norm": 0.31681877444589307, "acc_norm_stderr": 0.01188234995472301 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.23529411764705882, "acc_stderr": 0.02576725201085598, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.02576725201085598 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.27450980392156865, "acc_stderr": 0.018054027458815198, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.018054027458815198 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.35454545454545455, "acc_stderr": 0.04582004841505416, "acc_norm": 0.35454545454545455, "acc_norm_stderr": 0.04582004841505416 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.42857142857142855, "acc_stderr": 0.03168091161233882, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.03168091161233882 }, "harness|hendrycksTest-sociology|5": { "acc": 0.36318407960199006, "acc_stderr": 0.03400598505599014, "acc_norm": 0.36318407960199006, "acc_norm_stderr": 0.03400598505599014 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-virology|5": { "acc": 0.29518072289156627, "acc_stderr": 0.035509201856896294, "acc_norm": 0.29518072289156627, "acc_norm_stderr": 0.035509201856896294 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3157894736842105, "acc_stderr": 0.03565079670708311, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.03565079670708311 }, "harness|truthfulqa:mc|0": { "mc1": 0.20930232558139536, "mc1_stderr": 0.01424121943478583, "mc2": 0.34587723098212036, "mc2_stderr": 0.01348699014658101 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Lazycuber__Janemalion-6B
[ "region:us" ]
2023-08-17T22:56:53+00:00
{"pretty_name": "Evaluation run of Lazycuber/Janemalion-6B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lazycuber/Janemalion-6B](https://huggingface.co/Lazycuber/Janemalion-6B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__Janemalion-6B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-24T11:00:29.262151](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__Janemalion-6B/blob/main/results_2023-07-24T11%3A00%3A29.262151.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28835373902195927,\n \"acc_stderr\": 0.03262443339491777,\n \"acc_norm\": 0.29201970368726204,\n \"acc_norm_stderr\": 0.032622091393243714,\n \"mc1\": 0.20930232558139536,\n \"mc1_stderr\": 0.01424121943478583,\n \"mc2\": 0.34587723098212036,\n \"mc2_stderr\": 0.01348699014658101\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.386518771331058,\n \"acc_stderr\": 0.01423008476191048,\n \"acc_norm\": 0.42406143344709896,\n \"acc_norm_stderr\": 0.014441889627464398\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5052778331009758,\n \"acc_stderr\": 0.004989503417767287,\n \"acc_norm\": 0.6840270862378013,\n \"acc_norm_stderr\": 0.00463952045344403\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.30943396226415093,\n \"acc_stderr\": 0.028450154794118627,\n \"acc_norm\": 0.30943396226415093,\n \"acc_norm_stderr\": 0.028450154794118627\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n 
\"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.033687629322594316,\n \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.033687629322594316\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332204,\n \"acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332204\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.28974358974358977,\n \"acc_stderr\": 0.023000628243687968,\n \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.023000628243687968\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.02772206549336128,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.02772206549336128\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26055045871559634,\n \"acc_stderr\": 0.018819182034850068,\n \"acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.018819182034850068\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491656,\n \"acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491656\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.031980016601150726,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.031980016601150726\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138594,\n \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138594\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.04524596007030049,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.04524596007030049\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n \"acc_stderr\": 0.029202540153431177,\n \"acc_norm\": 0.27350427350427353,\n \"acc_norm_stderr\": 0.029202540153431177\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3243933588761175,\n \"acc_stderr\": 
0.016740929047162706,\n \"acc_norm\": 0.3243933588761175,\n \"acc_norm_stderr\": 0.016740929047162706\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.02454761779480383,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.02454761779480383\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961459\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.02718449890994162,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.02718449890994162\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.025842248700902168,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.025842248700902168\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.027281608344469414,\n \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.027281608344469414\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31681877444589307,\n \"acc_stderr\": 0.01188234995472301,\n \"acc_norm\": 0.31681877444589307,\n \"acc_norm_stderr\": 0.01188234995472301\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02576725201085598,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02576725201085598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815198,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815198\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03168091161233882,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03168091161233882\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.36318407960199006,\n \"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.36318407960199006,\n \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708311,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708311\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20930232558139536,\n \"mc1_stderr\": 0.01424121943478583,\n \"mc2\": 0.34587723098212036,\n \"mc2_stderr\": 0.01348699014658101\n }\n}\n```", "repo_url": "https://huggingface.co/Lazycuber/Janemalion-6B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:00:29.262151.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:00:29.262151.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:00:29.262151.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:00:29.262151.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:00:29.262151.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_00_29.262151", "path": ["results_2023-07-24T11:00:29.262151.parquet"]}, {"split": "latest", "path": ["results_2023-07-24T11:00:29.262151.parquet"]}]}]}
2023-08-27T11:25:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lazycuber/Janemalion-6B

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model Lazycuber/Janemalion-6B on the Open LLM Leaderboard.

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-07-24T11:00:29.262151 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
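The loading snippet referenced under "Dataset Summary" above was not preserved in this record's plain-text rendering. The sketch below is a minimal reconstruction, not the card's original code: the repository id is inferred from the leaderboard's usual `details_<org>__<model>` naming convention, while the configuration and split names are taken from this record's own metadata.

```python
from datasets import get_dataset_config_names, load_dataset

# Hypothetical repository id, inferred from the leaderboard's
# details_<org>__<model> naming convention (not stated in the card text).
repo_id = "open-llm-leaderboard/details_Lazycuber__Janemalion-6B"

# List the available configurations (the summary above mentions 61 task configs
# plus an aggregated "results" config).
print(get_dataset_config_names(repo_id))

# Load one configuration; "harness_truthfulqa_mc_0" and the "latest" split both
# appear in this record's configuration metadata, the "latest" split being the
# one the card text describes as pointing at the newest results.
data = load_dataset(repo_id, "harness_truthfulqa_mc_0", split="latest")
print(data)
```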
[ "# Dataset Card for Evaluation run of Lazycuber/Janemalion-6B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lazycuber/Janemalion-6B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T11:00:29.262151 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lazycuber/Janemalion-6B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lazycuber/Janemalion-6B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-24T11:00:29.262151 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lazycuber/Janemalion-6B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lazycuber/Janemalion-6B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-24T11:00:29.262151 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
55e45ec7083a0362f288da011916a9994e148dbe
# Dataset Card for Evaluation run of edor/Hermes-Platypus2-mini-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/edor/Hermes-Platypus2-mini-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [edor/Hermes-Platypus2-mini-7B](https://huggingface.co/edor/Hermes-Platypus2-mini-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-16T10:47:02.037059](https://huggingface.co/datasets/open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B/blob/main/results_2023-08-16T10%3A47%3A02.037059.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4739285188775824, "acc_stderr": 0.035185125877572575, "acc_norm": 0.4774082437104984, "acc_norm_stderr": 0.035170487487277746, "mc1": 0.3329253365973072, "mc1_stderr": 0.016497402382012055, "mc2": 0.49276058409873585, "mc2_stderr": 0.01516224977207343 }, "harness|arc:challenge|25": { "acc": 0.523037542662116, "acc_stderr": 0.014595873205358269, "acc_norm": 0.537542662116041, "acc_norm_stderr": 0.014570144495075581 }, "harness|hellaswag|10": { "acc": 0.6015733917546305, "acc_stderr": 0.004885735963346904, "acc_norm": 0.7923720374427405, "acc_norm_stderr": 0.0040477996462346365 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.42105263157894735, "acc_stderr": 0.040179012759817494, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.040179012759817494 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5056603773584906, "acc_stderr": 0.030770900763851316, "acc_norm": 0.5056603773584906, "acc_norm_stderr": 0.030770900763851316 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5, "acc_stderr": 0.04181210050035455, "acc_norm": 0.5, "acc_norm_stderr": 0.04181210050035455 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 
0.046056618647183814 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4161849710982659, "acc_stderr": 0.03758517775404947, "acc_norm": 0.4161849710982659, "acc_norm_stderr": 0.03758517775404947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179962, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179962 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4, "acc_stderr": 0.03202563076101735, "acc_norm": 0.4, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.041424397194893624, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.041424397194893624 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.43448275862068964, "acc_stderr": 0.04130740879555497, "acc_norm": 0.43448275862068964, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30158730158730157, "acc_stderr": 0.0236369759961018, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.0236369759961018 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3333333333333333, "acc_stderr": 0.042163702135578345, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.042163702135578345 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5225806451612903, "acc_stderr": 0.02841498501970786, "acc_norm": 0.5225806451612903, "acc_norm_stderr": 0.02841498501970786 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.33004926108374383, "acc_stderr": 0.033085304262282574, "acc_norm": 0.33004926108374383, "acc_norm_stderr": 0.033085304262282574 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6181818181818182, "acc_stderr": 0.03793713171165635, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.03793713171165635 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5707070707070707, "acc_stderr": 0.035265527246012, "acc_norm": 0.5707070707070707, "acc_norm_stderr": 0.035265527246012 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6683937823834197, "acc_stderr": 0.03397636541089118, "acc_norm": 0.6683937823834197, "acc_norm_stderr": 0.03397636541089118 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4307692307692308, "acc_stderr": 0.02510682066053975, "acc_norm": 0.4307692307692308, "acc_norm_stderr": 0.02510682066053975 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2518518518518518, "acc_stderr": 0.026466117538959912, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.026466117538959912 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.031968769891957786, "acc_norm": 
0.4117647058823529, "acc_norm_stderr": 0.031968769891957786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.26490066225165565, "acc_stderr": 0.036030385453603826, "acc_norm": 0.26490066225165565, "acc_norm_stderr": 0.036030385453603826 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6440366972477064, "acc_stderr": 0.020528559278244214, "acc_norm": 0.6440366972477064, "acc_norm_stderr": 0.020528559278244214 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.27314814814814814, "acc_stderr": 0.030388051301678116, "acc_norm": 0.27314814814814814, "acc_norm_stderr": 0.030388051301678116 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6323529411764706, "acc_stderr": 0.03384132045674119, "acc_norm": 0.6323529411764706, "acc_norm_stderr": 0.03384132045674119 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.030685820596610805, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.030685820596610805 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5515695067264574, "acc_stderr": 0.03337883736255098, "acc_norm": 0.5515695067264574, "acc_norm_stderr": 0.03337883736255098 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5190839694656488, "acc_stderr": 0.04382094705550988, "acc_norm": 0.5190839694656488, "acc_norm_stderr": 0.04382094705550988 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6611570247933884, "acc_stderr": 0.043207678075366705, "acc_norm": 0.6611570247933884, "acc_norm_stderr": 0.043207678075366705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5370370370370371, "acc_stderr": 0.04820403072760628, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.04820403072760628 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4601226993865031, "acc_stderr": 0.03915857291436971, "acc_norm": 0.4601226993865031, "acc_norm_stderr": 0.03915857291436971 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.5825242718446602, "acc_stderr": 0.048828405482122375, "acc_norm": 0.5825242718446602, "acc_norm_stderr": 0.048828405482122375 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7222222222222222, "acc_stderr": 0.02934311479809444, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.02934311479809444 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.55, "acc_stderr": 0.04999999999999999, "acc_norm": 0.55, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6577266922094508, "acc_stderr": 0.016967031766413624, "acc_norm": 0.6577266922094508, "acc_norm_stderr": 0.016967031766413624 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5346820809248555, "acc_stderr": 0.026854257928258875, "acc_norm": 0.5346820809248555, "acc_norm_stderr": 0.026854257928258875 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.25251396648044694, "acc_stderr": 0.014530330201468636, "acc_norm": 0.25251396648044694, "acc_norm_stderr": 0.014530330201468636 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.49673202614379086, "acc_stderr": 0.028629305194003543, "acc_norm": 0.49673202614379086, "acc_norm_stderr": 0.028629305194003543 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5691318327974276, "acc_stderr": 0.028125340983972714, "acc_norm": 0.5691318327974276, "acc_norm_stderr": 0.028125340983972714 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.5061728395061729, "acc_stderr": 0.027818623962583295, "acc_norm": 0.5061728395061729, "acc_norm_stderr": 0.027818623962583295 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3900709219858156, "acc_stderr": 0.029097675599463926, "acc_norm": 0.3900709219858156, "acc_norm_stderr": 0.029097675599463926 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3539765319426336, "acc_stderr": 0.012213504731731637, "acc_norm": 0.3539765319426336, "acc_norm_stderr": 0.012213504731731637 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.47058823529411764, "acc_stderr": 0.030320243265004137, "acc_norm": 0.47058823529411764, "acc_norm_stderr": 0.030320243265004137 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.44607843137254904, "acc_stderr": 0.02010986454718136, "acc_norm": 0.44607843137254904, "acc_norm_stderr": 0.02010986454718136 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5181818181818182, "acc_stderr": 0.04785964010794916, "acc_norm": 0.5181818181818182, "acc_norm_stderr": 0.04785964010794916 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.563265306122449, "acc_stderr": 0.031751952375833226, "acc_norm": 0.563265306122449, "acc_norm_stderr": 0.031751952375833226 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6218905472636815, "acc_stderr": 0.034288678487786564, "acc_norm": 0.6218905472636815, "acc_norm_stderr": 0.034288678487786564 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6374269005847953, "acc_stderr": 0.0368713061556206, "acc_norm": 0.6374269005847953, "acc_norm_stderr": 0.0368713061556206 }, "harness|truthfulqa:mc|0": { "mc1": 0.3329253365973072, "mc1_stderr": 0.016497402382012055, "mc2": 0.49276058409873585, "mc2_stderr": 0.01516224977207343 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
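The snippet in the card above loads a single task configuration. The aggregated scores it describes live in the separate "results" configuration; a minimal sketch of reading them is shown below, under the assumption that this repository mirrors the layout of the other runs in this collection (a "results" config exposing its most recent data through a "latest" split).

```python
from datasets import load_dataset

# Aggregated metrics for the run (overall accuracy, per-task scores, TruthfulQA MC values).
# The "latest" split name is assumed from the sibling runs' metadata; if it is missing,
# the timestamped split (e.g. "2023_08_16T10_47_02.037059") should work instead.
results = load_dataset(
    "open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B",
    "results",
    split="latest",
)
print(results.column_names)
```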
open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B
[ "region:us" ]
2023-08-17T22:57:01+00:00
{"pretty_name": "Evaluation run of edor/Hermes-Platypus2-mini-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [edor/Hermes-Platypus2-mini-7B](https://huggingface.co/edor/Hermes-Platypus2-mini-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-16T10:47:02.037059](https://huggingface.co/datasets/open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B/blob/main/results_2023-08-16T10%3A47%3A02.037059.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4739285188775824,\n \"acc_stderr\": 0.035185125877572575,\n \"acc_norm\": 0.4774082437104984,\n \"acc_norm_stderr\": 0.035170487487277746,\n \"mc1\": 0.3329253365973072,\n \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49276058409873585,\n \"mc2_stderr\": 0.01516224977207343\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.014595873205358269,\n \"acc_norm\": 0.537542662116041,\n \"acc_norm_stderr\": 0.014570144495075581\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6015733917546305,\n \"acc_stderr\": 0.004885735963346904,\n \"acc_norm\": 0.7923720374427405,\n \"acc_norm_stderr\": 0.0040477996462346365\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851316,\n \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851316\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n 
\"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5707070707070707,\n \"acc_stderr\": 0.035265527246012,\n \"acc_norm\": 0.5707070707070707,\n \"acc_norm_stderr\": 0.035265527246012\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4307692307692308,\n \"acc_stderr\": 
0.02510682066053975,\n \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.02510682066053975\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6440366972477064,\n \"acc_stderr\": 0.020528559278244214,\n \"acc_norm\": 0.6440366972477064,\n \"acc_norm_stderr\": 0.020528559278244214\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674119,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674119\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610805,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610805\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436971,\n \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436971\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02934311479809444,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02934311479809444\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n \"acc_stderr\": 0.016967031766413624,\n \"acc_norm\": 0.6577266922094508,\n \"acc_norm_stderr\": 
0.016967031766413624\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258875,\n \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258875\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n \"acc_stderr\": 0.014530330201468636,\n \"acc_norm\": 0.25251396648044694,\n \"acc_norm_stderr\": 0.014530330201468636\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3539765319426336,\n \"acc_stderr\": 0.012213504731731637,\n \"acc_norm\": 0.3539765319426336,\n \"acc_norm_stderr\": 0.012213504731731637\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.44607843137254904,\n \"acc_stderr\": 0.02010986454718136,\n \"acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.02010986454718136\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.031751952375833226,\n \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.031751952375833226\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.0368713061556206,\n \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.0368713061556206\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3329253365973072,\n \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49276058409873585,\n \"mc2_stderr\": 0.01516224977207343\n }\n}\n```", "repo_url": "https://huggingface.co/edor/Hermes-Platypus2-mini-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_08_16T10_47_02.037059", "path": ["**/details_harness|arc:challenge|25_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hellaswag|10_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T10:47:02.037059.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T10_47_02.037059", "path": ["results_2023-08-16T10:47:02.037059.parquet"]}, {"split": "latest", "path": ["results_2023-08-16T10:47:02.037059.parquet"]}]}]}
2023-08-27T11:25:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of edor/Hermes-Platypus2-mini-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model edor/Hermes-Platypus2-mini-7B on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-16T10:47:02.037059 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
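The loading snippet was stripped from this flattened copy of the card, so here is a minimal sketch of the call it refers to. The config and split names are taken from the "configs" metadata listed above; the repository id is an assumption, inferred from the leaderboard's usual `details_<org>__<model>` naming rather than stated in this field.

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the naming convention used by the other
# cards in this dump; the config and split names come from the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_edor__Hermes-Platypus2-mini-7B",
    "harness_truthfulqa_mc_0",
    split="latest",  # or the timestamped split "2023_08_16T10_47_02.037059"
)
```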
[ "# Dataset Card for Evaluation run of edor/Hermes-Platypus2-mini-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model edor/Hermes-Platypus2-mini-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-16T10:47:02.037059 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of edor/Hermes-Platypus2-mini-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model edor/Hermes-Platypus2-mini-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-16T10:47:02.037059 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of edor/Hermes-Platypus2-mini-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model edor/Hermes-Platypus2-mini-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-16T10:47:02.037059 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d4296e81c6189094990e4d7ac97bd735de28fdfd
# Dataset Card for Evaluation run of edor/Stable-Platypus2-mini-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/edor/Stable-Platypus2-mini-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [edor/Stable-Platypus2-mini-7B](https://huggingface.co/edor/Stable-Platypus2-mini-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_edor__Stable-Platypus2-mini-7B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-16T10:44:20.574252](https://huggingface.co/datasets/open-llm-leaderboard/details_edor__Stable-Platypus2-mini-7B/blob/main/results_2023-08-16T10%3A44%3A20.574252.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.519238503099194, "acc_stderr": 0.03487887571401071, "acc_norm": 0.5229272130971759, "acc_norm_stderr": 0.03486396112216957, "mc1": 0.3561811505507956, "mc1_stderr": 0.01676379072844634, "mc2": 0.5106039601116779, "mc2_stderr": 0.015454187246822623 }, "harness|arc:challenge|25": { "acc": 0.5238907849829352, "acc_stderr": 0.014594701798071654, "acc_norm": 0.5486348122866894, "acc_norm_stderr": 0.014542104569955267 }, "harness|hellaswag|10": { "acc": 0.5965943039235212, "acc_stderr": 0.004895782107786497, "acc_norm": 0.7894841665006971, "acc_norm_stderr": 0.0040684184172756635 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.40789473684210525, "acc_stderr": 0.03999309712777471, "acc_norm": 0.40789473684210525, "acc_norm_stderr": 0.03999309712777471 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5924528301886792, "acc_stderr": 0.03024223380085449, "acc_norm": 0.5924528301886792, "acc_norm_stderr": 0.03024223380085449 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666666, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666666 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, 
"acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4682080924855491, "acc_stderr": 0.03804749744364764, "acc_norm": 0.4682080924855491, "acc_norm_stderr": 0.03804749744364764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.04440521906179327, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.04440521906179327 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4765957446808511, "acc_stderr": 0.03265019475033582, "acc_norm": 0.4765957446808511, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.041424397194893624, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.041424397194893624 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4896551724137931, "acc_stderr": 0.04165774775728763, "acc_norm": 0.4896551724137931, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30158730158730157, "acc_stderr": 0.0236369759961018, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.0236369759961018 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.31746031746031744, "acc_stderr": 0.04163453031302859, "acc_norm": 0.31746031746031744, "acc_norm_stderr": 0.04163453031302859 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5645161290322581, "acc_stderr": 0.02820622559150274, "acc_norm": 0.5645161290322581, "acc_norm_stderr": 0.02820622559150274 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3448275862068966, "acc_stderr": 0.033442837442804574, "acc_norm": 0.3448275862068966, "acc_norm_stderr": 0.033442837442804574 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7212121212121212, "acc_stderr": 0.03501438706296781, "acc_norm": 0.7212121212121212, "acc_norm_stderr": 0.03501438706296781 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6414141414141414, "acc_stderr": 0.034169036403915214, "acc_norm": 0.6414141414141414, "acc_norm_stderr": 0.034169036403915214 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.030031147977641538, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.030031147977641538 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4948717948717949, "acc_stderr": 0.02534967290683866, "acc_norm": 0.4948717948717949, "acc_norm_stderr": 0.02534967290683866 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5168067226890757, 
"acc_stderr": 0.03246013680375308, "acc_norm": 0.5168067226890757, "acc_norm_stderr": 0.03246013680375308 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7321100917431193, "acc_stderr": 0.018987462257978652, "acc_norm": 0.7321100917431193, "acc_norm_stderr": 0.018987462257978652 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.03372343271653063, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.03372343271653063 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.03228210387037893, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.03228210387037893 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7130801687763713, "acc_stderr": 0.029443773022594693, "acc_norm": 0.7130801687763713, "acc_norm_stderr": 0.029443773022594693 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6143497757847534, "acc_stderr": 0.03266842214289201, "acc_norm": 0.6143497757847534, "acc_norm_stderr": 0.03266842214289201 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6106870229007634, "acc_stderr": 0.04276486542814591, "acc_norm": 0.6106870229007634, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6859504132231405, "acc_stderr": 0.042369647530410184, "acc_norm": 0.6859504132231405, "acc_norm_stderr": 0.042369647530410184 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5460122699386503, "acc_stderr": 0.0391170190467718, "acc_norm": 0.5460122699386503, "acc_norm_stderr": 0.0391170190467718 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.782051282051282, "acc_stderr": 0.02704685763071669, "acc_norm": 0.782051282051282, "acc_norm_stderr": 0.02704685763071669 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7164750957854407, "acc_stderr": 0.01611731816683227, "acc_norm": 0.7164750957854407, "acc_norm_stderr": 0.01611731816683227 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5780346820809249, "acc_stderr": 0.026589231142174263, "acc_norm": 0.5780346820809249, "acc_norm_stderr": 0.026589231142174263 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2569832402234637, "acc_stderr": 0.01461446582196633, "acc_norm": 0.2569832402234637, "acc_norm_stderr": 0.01461446582196633 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5424836601307189, "acc_stderr": 0.028526383452142635, "acc_norm": 0.5424836601307189, "acc_norm_stderr": 0.028526383452142635 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5852090032154341, "acc_stderr": 0.027982680459759563, "acc_norm": 0.5852090032154341, "acc_norm_stderr": 0.027982680459759563 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.5370370370370371, "acc_stderr": 0.027744313443376536, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.027744313443376536 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3900709219858156, "acc_stderr": 0.029097675599463926, "acc_norm": 0.3900709219858156, "acc_norm_stderr": 0.029097675599463926 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3917861799217731, "acc_stderr": 0.01246756441814513, "acc_norm": 0.3917861799217731, "acc_norm_stderr": 0.01246756441814513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.03035230339535197, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.03035230339535197 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5098039215686274, "acc_stderr": 0.0202239460050743, "acc_norm": 0.5098039215686274, "acc_norm_stderr": 0.0202239460050743 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.046534298079135075, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.046534298079135075 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6571428571428571, "acc_stderr": 0.030387262919547735, "acc_norm": 0.6571428571428571, "acc_norm_stderr": 0.030387262919547735 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6318407960199005, "acc_stderr": 0.03410410565495302, "acc_norm": 0.6318407960199005, "acc_norm_stderr": 0.03410410565495302 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.03851597683718534, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.03851597683718534 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.695906432748538, "acc_stderr": 0.03528211258245229, "acc_norm": 0.695906432748538, "acc_norm_stderr": 0.03528211258245229 }, "harness|truthfulqa:mc|0": { "mc1": 0.3561811505507956, "mc1_stderr": 0.01676379072844634, "mc2": 0.5106039601116779, "mc2_stderr": 0.015454187246822623 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_edor__Stable-Platypus2-mini-7B
[ "region:us" ]
2023-08-17T22:57:18+00:00
{"pretty_name": "Evaluation run of edor/Stable-Platypus2-mini-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [edor/Stable-Platypus2-mini-7B](https://huggingface.co/edor/Stable-Platypus2-mini-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_edor__Stable-Platypus2-mini-7B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-16T10:44:20.574252](https://huggingface.co/datasets/open-llm-leaderboard/details_edor__Stable-Platypus2-mini-7B/blob/main/results_2023-08-16T10%3A44%3A20.574252.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.519238503099194,\n \"acc_stderr\": 0.03487887571401071,\n \"acc_norm\": 0.5229272130971759,\n \"acc_norm_stderr\": 0.03486396112216957,\n \"mc1\": 0.3561811505507956,\n \"mc1_stderr\": 0.01676379072844634,\n \"mc2\": 0.5106039601116779,\n \"mc2_stderr\": 0.015454187246822623\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5238907849829352,\n \"acc_stderr\": 0.014594701798071654,\n \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955267\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5965943039235212,\n \"acc_stderr\": 0.004895782107786497,\n \"acc_norm\": 0.7894841665006971,\n \"acc_norm_stderr\": 0.0040684184172756635\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.03024223380085449,\n \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.03024223380085449\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n 
\"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5645161290322581,\n \"acc_stderr\": 0.02820622559150274,\n \"acc_norm\": 0.5645161290322581,\n \"acc_norm_stderr\": 0.02820622559150274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.033442837442804574,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.033442837442804574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.034169036403915214,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.034169036403915214\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7321100917431193,\n \"acc_stderr\": 0.018987462257978652,\n \"acc_norm\": 0.7321100917431193,\n \"acc_norm_stderr\": 0.018987462257978652\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037893,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037893\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.029443773022594693,\n \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.029443773022594693\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n \"acc_stderr\": 0.02704685763071669,\n \"acc_norm\": 0.782051282051282,\n \"acc_norm_stderr\": 0.02704685763071669\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7164750957854407,\n \"acc_stderr\": 0.01611731816683227,\n \"acc_norm\": 
0.7164750957854407,\n \"acc_norm_stderr\": 0.01611731816683227\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.026589231142174263,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.026589231142174263\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n \"acc_stderr\": 0.01461446582196633,\n \"acc_norm\": 0.2569832402234637,\n \"acc_norm_stderr\": 0.01461446582196633\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.028526383452142635,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.028526383452142635\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n \"acc_stderr\": 0.027982680459759563,\n \"acc_norm\": 0.5852090032154341,\n \"acc_norm_stderr\": 0.027982680459759563\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3917861799217731,\n \"acc_stderr\": 0.01246756441814513,\n \"acc_norm\": 0.3917861799217731,\n \"acc_norm_stderr\": 0.01246756441814513\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535197,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535197\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547735,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547735\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n \"acc_stderr\": 0.03410410565495302,\n \"acc_norm\": 0.6318407960199005,\n \"acc_norm_stderr\": 0.03410410565495302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.03851597683718534,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.03851597683718534\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n \"mc1_stderr\": 0.01676379072844634,\n \"mc2\": 0.5106039601116779,\n \"mc2_stderr\": 0.015454187246822623\n }\n}\n```", "repo_url": "https://huggingface.co/edor/Stable-Platypus2-mini-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": 
[{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|arc:challenge|25_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hellaswag|10_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:44:20.574252.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:44:20.574252.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:44:20.574252.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T10:44:20.574252.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T10:44:20.574252.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T10_44_20.574252", "path": ["results_2023-08-16T10:44:20.574252.parquet"]}, {"split": "latest", "path": ["results_2023-08-16T10:44:20.574252.parquet"]}]}]}
2023-08-27T11:25:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of edor/Stable-Platypus2-mini-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model edor/Stable-Platypus2-mini-7B on the Open LLM Leaderboard. The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-16T10:44:20.574252 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of edor/Stable-Platypus2-mini-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model edor/Stable-Platypus2-mini-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-16T10:44:20.574252 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of edor/Stable-Platypus2-mini-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model edor/Stable-Platypus2-mini-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-16T10:44:20.574252 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of edor/Stable-Platypus2-mini-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model edor/Stable-Platypus2-mini-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-16T10:44:20.574252 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f173672e804125ca7cbda1809fffafab5a492cba
# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente-30b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aeala/GPT4-x-AlpacaDente-30b](https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b", "harness_winogrande_5", split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T18:13:58.646455](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b/blob/main/results_2023-09-17T18-13-58.646455.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.32120385906040266,
        "em_stderr": 0.004781891422636473,
        "f1": 0.43280620805369485,
        "f1_stderr": 0.0045611946956929435,
        "acc": 0.5439418899180396,
        "acc_stderr": 0.012071731077966974
    },
    "harness|drop|3": {
        "em": 0.32120385906040266,
        "em_stderr": 0.004781891422636473,
        "f1": 0.43280620805369485,
        "f1_stderr": 0.0045611946956929435
    },
    "harness|gsm8k|5": {
        "acc": 0.3009855951478393,
        "acc_stderr": 0.012634504465211194
    },
    "harness|winogrande|5": {
        "acc": 0.7868981846882399,
        "acc_stderr": 0.011508957690722754
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
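The example in the card above loads per-task details. For the aggregated metrics instead, the "results" configuration and the "latest" split (both defined in the metadata below) can be loaded the same way; a minimal sketch, assuming only that the `datasets` library is installed:

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run; the "latest" split points to
# the most recent results parquet listed in the metadata below.
results = load_dataset(
    "open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b",
    "results",
    split="latest",
)
```

A per-run split such as "2023_09_17T18_13_58.646455" can be requested instead of "latest" to pin the results of one specific evaluation run.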
open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b
[ "region:us" ]
2023-08-17T22:57:27+00:00
{"pretty_name": "Evaluation run of Aeala/GPT4-x-AlpacaDente-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aeala/GPT4-x-AlpacaDente-30b](https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T18:13:58.646455](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b/blob/main/results_2023-09-17T18-13-58.646455.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32120385906040266,\n \"em_stderr\": 0.004781891422636473,\n \"f1\": 0.43280620805369485,\n \"f1_stderr\": 0.0045611946956929435,\n \"acc\": 0.5439418899180396,\n \"acc_stderr\": 0.012071731077966974\n },\n \"harness|drop|3\": {\n \"em\": 0.32120385906040266,\n \"em_stderr\": 0.004781891422636473,\n \"f1\": 0.43280620805369485,\n \"f1_stderr\": 0.0045611946956929435\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3009855951478393,\n \"acc_stderr\": 0.012634504465211194\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722754\n }\n}\n```", "repo_url": "https://huggingface.co/Aeala/GPT4-x-AlpacaDente-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|arc:challenge|25_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T18_13_58.646455", "path": ["**/details_harness|drop|3_2023-09-17T18-13-58.646455.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T18-13-58.646455.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T18_13_58.646455", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-13-58.646455.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-13-58.646455.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hellaswag|10_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T23:04:17.245052.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T23:04:17.245052.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T18_13_58.646455", "path": ["**/details_harness|winogrande|5_2023-09-17T18-13-58.646455.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T18-13-58.646455.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T23_04_17.245052", "path": ["results_2023-07-19T23:04:17.245052.parquet"]}, {"split": "2023_09_17T18_13_58.646455", "path": ["results_2023-09-17T18-13-58.646455.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T18-13-58.646455.parquet"]}]}]}
2023-09-17T17:14:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente-30b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente-30b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the example after this card text): ## Latest results These are the latest results from run 2023-09-17T18:13:58.646455 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
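The plain-text rendering above strips the loading snippet that the sentence "To load the details from a run..." refers to. Below is a minimal sketch of that call; the repository id `open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b` is assumed from the leaderboard's usual `details_<org>__<model>` naming convention (it is not spelled out in this stripped rendering), and `harness_winogrande_5` is one of the configurations listed for this record.

```python
from datasets import load_dataset

# Sketch only: the repository id is assumed from the leaderboard's
# details_<org>__<model> naming convention for this model's eval details.
data = load_dataset(
    "open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente-30b",
    "harness_winogrande_5",  # one evaluated task; the "results" config holds aggregated metrics
    split="train",           # the card notes that "train" always points to the latest results
)
```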
[ "# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T18:13:58.646455(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T18:13:58.646455(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente-30b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T18:13:58.646455(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
da09939c0f0417fbd9941f2dadc7c65617125eda
# Dataset Card for Evaluation run of Aeala/GPT4-x-Alpasta-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Aeala/GPT4-x-Alpasta-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Aeala/GPT4-x-Alpasta-13b](https://huggingface.co/Aeala/GPT4-x-Alpasta-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aeala__GPT4-x-Alpasta-13b",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T06:14:44.788892](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-Alpasta-13b/blob/main/results_2023-10-13T06-14-44.788892.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.24108640939597314,
        "em_stderr": 0.004380484831888229,
        "f1": 0.3294389681208072,
        "f1_stderr": 0.004349222240156451,
        "acc": 0.4137438194609415,
        "acc_stderr": 0.010067997934742997
    },
    "harness|drop|3": {
        "em": 0.24108640939597314,
        "em_stderr": 0.004380484831888229,
        "f1": 0.3294389681208072,
        "f1_stderr": 0.004349222240156451
    },
    "harness|gsm8k|5": {
        "acc": 0.08794541319181198,
        "acc_stderr": 0.007801162197487709
    },
    "harness|winogrande|5": {
        "acc": 0.739542225730071,
        "acc_stderr": 0.012334833671998285
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Aeala__GPT4-x-Alpasta-13b
[ "region:us" ]
2023-08-17T22:57:37+00:00
{"pretty_name": "Evaluation run of Aeala/GPT4-x-Alpasta-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aeala/GPT4-x-Alpasta-13b](https://huggingface.co/Aeala/GPT4-x-Alpasta-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aeala__GPT4-x-Alpasta-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T06:14:44.788892](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-Alpasta-13b/blob/main/results_2023-10-13T06-14-44.788892.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24108640939597314,\n \"em_stderr\": 0.004380484831888229,\n \"f1\": 0.3294389681208072,\n \"f1_stderr\": 0.004349222240156451,\n \"acc\": 0.4137438194609415,\n \"acc_stderr\": 0.010067997934742997\n },\n \"harness|drop|3\": {\n \"em\": 0.24108640939597314,\n \"em_stderr\": 0.004380484831888229,\n \"f1\": 0.3294389681208072,\n \"f1_stderr\": 0.004349222240156451\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08794541319181198,\n \"acc_stderr\": 0.007801162197487709\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998285\n }\n}\n```", "repo_url": "https://huggingface.co/Aeala/GPT4-x-Alpasta-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T06_14_44.788892", "path": ["**/details_harness|drop|3_2023-10-13T06-14-44.788892.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T06-14-44.788892.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T06_14_44.788892", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-14-44.788892.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T06-14-44.788892.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:10:23.320662.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:10:23.320662.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:10:23.320662.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:10:23.320662.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:10:23.320662.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T06_14_44.788892", "path": ["**/details_harness|winogrande|5_2023-10-13T06-14-44.788892.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T06-14-44.788892.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_10_23.320662", "path": ["results_2023-07-19T19:10:23.320662.parquet"]}, {"split": "2023_10_13T06_14_44.788892", "path": ["results_2023-10-13T06-14-44.788892.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T06-14-44.788892.parquet"]}]}]}
2023-10-13T05:14:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aeala/GPT4-x-Alpasta-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aeala/GPT4-x-Alpasta-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the example after this card text): ## Latest results These are the latest results from run 2023-10-13T06:14:44.788892 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
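For completeness, this is the loading snippet that the stripped card text above points to; it matches the code given in this record's full card and metadata (repo id `open-llm-leaderboard/details_Aeala__GPT4-x-Alpasta-13b`, config `harness_winogrande_5`):

```python
from datasets import load_dataset

# Per-example details of the winogrande eval for Aeala/GPT4-x-Alpasta-13b;
# per the card text, the "train" split points at the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_Aeala__GPT4-x-Alpasta-13b",
    "harness_winogrande_5",
    split="train",
)
```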
[ "# Dataset Card for Evaluation run of Aeala/GPT4-x-Alpasta-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-Alpasta-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:14:44.788892(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aeala/GPT4-x-Alpasta-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-Alpasta-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T06:14:44.788892(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aeala/GPT4-x-Alpasta-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-Alpasta-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T06:14:44.788892(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
feb304dd10eb1e6c0629d2c8323f02f423a2f6a7
# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente2-30b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Aeala/GPT4-x-AlpacaDente2-30b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Aeala/GPT4-x-AlpacaDente2-30b](https://huggingface.co/Aeala/GPT4-x-AlpacaDente2-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente2-30b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-16T22:37:46.208428](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente2-30b/blob/main/results_2023-10-16T22-37-46.208428.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.38873741610738255, "em_stderr": 0.004992082219869434, "f1": 0.47062919463087477, "f1_stderr": 0.004742581525440341, "acc": 0.5245001564769177, "acc_stderr": 0.011905481321413287 }, "harness|drop|3": { "em": 0.38873741610738255, "em_stderr": 0.004992082219869434, "f1": 0.47062919463087477, "f1_stderr": 0.004742581525440341 }, "harness|gsm8k|5": { "acc": 0.2676269901440485, "acc_stderr": 0.012194764427053346 }, "harness|winogrande|5": { "acc": 0.7813733228097869, "acc_stderr": 0.011616198215773229 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente2-30b
[ "region:us" ]
2023-08-17T22:57:46+00:00
{"pretty_name": "Evaluation run of Aeala/GPT4-x-AlpacaDente2-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aeala/GPT4-x-AlpacaDente2-30b](https://huggingface.co/Aeala/GPT4-x-AlpacaDente2-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente2-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T22:37:46.208428](https://huggingface.co/datasets/open-llm-leaderboard/details_Aeala__GPT4-x-AlpacaDente2-30b/blob/main/results_2023-10-16T22-37-46.208428.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.38873741610738255,\n \"em_stderr\": 0.004992082219869434,\n \"f1\": 0.47062919463087477,\n \"f1_stderr\": 0.004742581525440341,\n \"acc\": 0.5245001564769177,\n \"acc_stderr\": 0.011905481321413287\n },\n \"harness|drop|3\": {\n \"em\": 0.38873741610738255,\n \"em_stderr\": 0.004992082219869434,\n \"f1\": 0.47062919463087477,\n \"f1_stderr\": 0.004742581525440341\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2676269901440485,\n \"acc_stderr\": 0.012194764427053346\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773229\n }\n}\n```", "repo_url": "https://huggingface.co/Aeala/GPT4-x-AlpacaDente2-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T22_37_46.208428", "path": ["**/details_harness|drop|3_2023-10-16T22-37-46.208428.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T22-37-46.208428.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T22_37_46.208428", "path": ["**/details_harness|gsm8k|5_2023-10-16T22-37-46.208428.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T22-37-46.208428.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:58:58.729379.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:58:58.729379.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:58:58.729379.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:58:58.729379.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:58:58.729379.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T22_37_46.208428", "path": ["**/details_harness|winogrande|5_2023-10-16T22-37-46.208428.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T22-37-46.208428.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_58_58.729379", "path": ["results_2023-07-19T22:58:58.729379.parquet"]}, {"split": "2023_10_16T22_37_46.208428", "path": ["results_2023-10-16T22-37-46.208428.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T22-37-46.208428.parquet"]}]}]}
2023-10-16T21:37:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente2-30b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente2-30b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T22:37:46.208428 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente2-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente2-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T22:37:46.208428(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente2-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente2-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T22:37:46.208428(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aeala/GPT4-x-AlpacaDente2-30b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Aeala/GPT4-x-AlpacaDente2-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T22:37:46.208428(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ce7ef36855032b1620c8b109bdcc303caa70df71
# Dataset Card for Evaluation run of acrastt/Vicuna-3B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/acrastt/Vicuna-3B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [acrastt/Vicuna-3B](https://huggingface.co/acrastt/Vicuna-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_acrastt__Vicuna-3B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-17T13:42:02.549031](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Vicuna-3B/blob/main/results_2023-08-17T13%3A42%3A02.549031.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2811009875581447, "acc_stderr": 0.03252062846238445, "acc_norm": 0.2849236147706273, "acc_norm_stderr": 0.03251779934164602, "mc1": 0.24479804161566707, "mc1_stderr": 0.01505186948671501, "mc2": 0.38343994316155305, "mc2_stderr": 0.013903929837677163 }, "harness|arc:challenge|25": { "acc": 0.3651877133105802, "acc_stderr": 0.014070265519268804, "acc_norm": 0.4129692832764505, "acc_norm_stderr": 0.014388344935398326 }, "harness|hellaswag|10": { "acc": 0.5407289384584744, "acc_stderr": 0.004973199296339957, "acc_norm": 0.7184823740290779, "acc_norm_stderr": 0.004488201756642574 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909281, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.23703703703703705, "acc_stderr": 0.03673731683969506, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.03673731683969506 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.26973684210526316, "acc_stderr": 0.03611780560284898, "acc_norm": 0.26973684210526316, "acc_norm_stderr": 0.03611780560284898 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2641509433962264, "acc_stderr": 0.027134291628741713, "acc_norm": 0.2641509433962264, "acc_norm_stderr": 0.027134291628741713 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.19, "acc_stderr": 0.03942772444036623, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036623 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2138728323699422, "acc_stderr": 0.03126511206173044, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.03126511206173044 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2647058823529412, "acc_stderr": 0.04389869956808778, "acc_norm": 0.2647058823529412, "acc_norm_stderr": 0.04389869956808778 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.33617021276595743, "acc_stderr": 0.030881618520676942, "acc_norm": 0.33617021276595743, "acc_norm_stderr": 0.030881618520676942 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21929824561403508, "acc_stderr": 0.03892431106518755, "acc_norm": 0.21929824561403508, "acc_norm_stderr": 0.03892431106518755 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2620689655172414, "acc_stderr": 0.036646663372252565, "acc_norm": 0.2620689655172414, "acc_norm_stderr": 0.036646663372252565 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.0230681888482611, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.0230681888482611 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.20634920634920634, "acc_stderr": 0.0361960452412425, "acc_norm": 0.20634920634920634, "acc_norm_stderr": 0.0361960452412425 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.22903225806451613, "acc_stderr": 0.023904914311782648, "acc_norm": 0.22903225806451613, "acc_norm_stderr": 0.023904914311782648 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2512315270935961, "acc_stderr": 0.030516530732694436, "acc_norm": 0.2512315270935961, "acc_norm_stderr": 0.030516530732694436 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.29, "acc_stderr": 0.04560480215720685, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720685 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.30303030303030304, "acc_stderr": 0.035886248000917075, "acc_norm": 0.30303030303030304, "acc_norm_stderr": 0.035886248000917075 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.25252525252525254, "acc_stderr": 0.030954055470365904, "acc_norm": 0.25252525252525254, "acc_norm_stderr": 0.030954055470365904 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.24352331606217617, "acc_stderr": 0.03097543638684543, "acc_norm": 0.24352331606217617, "acc_norm_stderr": 0.03097543638684543 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.258974358974359, "acc_stderr": 0.02221110681006166, "acc_norm": 0.258974358974359, "acc_norm_stderr": 0.02221110681006166 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.026335739404055803, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.026335739404055803 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.24789915966386555, "acc_stderr": 0.028047967224176896, 
"acc_norm": 0.24789915966386555, "acc_norm_stderr": 0.028047967224176896 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.03802039760107903, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.03802039760107903 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.26422018348623855, "acc_stderr": 0.01890416417151019, "acc_norm": 0.26422018348623855, "acc_norm_stderr": 0.01890416417151019 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.029157522184605607, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.029157522184605607 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.23039215686274508, "acc_stderr": 0.02955429260569507, "acc_norm": 0.23039215686274508, "acc_norm_stderr": 0.02955429260569507 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2869198312236287, "acc_stderr": 0.02944377302259469, "acc_norm": 0.2869198312236287, "acc_norm_stderr": 0.02944377302259469 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.40358744394618834, "acc_stderr": 0.032928028193303135, "acc_norm": 0.40358744394618834, "acc_norm_stderr": 0.032928028193303135 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.24427480916030533, "acc_stderr": 0.03768335959728745, "acc_norm": 0.24427480916030533, "acc_norm_stderr": 0.03768335959728745 }, "harness|hendrycksTest-international_law|5": { "acc": 0.30578512396694213, "acc_stderr": 0.04205953933884124, "acc_norm": 0.30578512396694213, "acc_norm_stderr": 0.04205953933884124 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.26851851851851855, "acc_stderr": 0.04284467968052192, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.04284467968052192 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22699386503067484, "acc_stderr": 0.032910995786157686, "acc_norm": 0.22699386503067484, "acc_norm_stderr": 0.032910995786157686 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.04007341809755805, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.04007341809755805 }, "harness|hendrycksTest-management|5": { "acc": 0.27184466019417475, "acc_stderr": 0.044052680241409216, "acc_norm": 0.27184466019417475, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2777777777777778, "acc_stderr": 0.02934311479809445, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.02934311479809445 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2681992337164751, "acc_stderr": 0.015842430835269445, "acc_norm": 0.2681992337164751, "acc_norm_stderr": 0.015842430835269445 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.30346820809248554, "acc_stderr": 0.024752411960917202, "acc_norm": 0.30346820809248554, "acc_norm_stderr": 0.024752411960917202 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.014422292204808835, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.014422292204808835 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.27450980392156865, "acc_stderr": 0.02555316999182651, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.02555316999182651 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.29260450160771706, "acc_stderr": 0.02583989833487798, "acc_norm": 0.29260450160771706, "acc_norm_stderr": 0.02583989833487798 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.29012345679012347, "acc_stderr": 0.025251173936495022, "acc_norm": 0.29012345679012347, "acc_norm_stderr": 0.025251173936495022 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2765957446808511, "acc_stderr": 0.026684564340460987, "acc_norm": 0.2765957446808511, "acc_norm_stderr": 0.026684564340460987 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23989569752281617, "acc_stderr": 0.010906282617981641, "acc_norm": 0.23989569752281617, "acc_norm_stderr": 0.010906282617981641 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.22058823529411764, "acc_stderr": 0.025187786660227276, "acc_norm": 0.22058823529411764, "acc_norm_stderr": 0.025187786660227276 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.28594771241830064, "acc_stderr": 0.018280485072954676, "acc_norm": 0.28594771241830064, "acc_norm_stderr": 0.018280485072954676 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2636363636363636, "acc_stderr": 0.04220224692971987, "acc_norm": 0.2636363636363636, "acc_norm_stderr": 0.04220224692971987 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3673469387755102, "acc_stderr": 0.030862144921087558, "acc_norm": 0.3673469387755102, "acc_norm_stderr": 0.030862144921087558 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2736318407960199, "acc_stderr": 0.03152439186555402, "acc_norm": 0.2736318407960199, "acc_norm_stderr": 0.03152439186555402 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-virology|5": { "acc": 0.3192771084337349, "acc_stderr": 0.03629335329947861, "acc_norm": 0.3192771084337349, "acc_norm_stderr": 0.03629335329947861 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3157894736842105, "acc_stderr": 0.035650796707083106, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.035650796707083106 }, "harness|truthfulqa:mc|0": { "mc1": 0.24479804161566707, "mc1_stderr": 0.01505186948671501, "mc2": 0.38343994316155305, "mc2_stderr": 0.013903929837677163 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_acrastt__Vicuna-3B
[ "region:us" ]
2023-08-17T22:57:55+00:00
{"pretty_name": "Evaluation run of acrastt/Vicuna-3B", "dataset_summary": "Dataset automatically created during the evaluation run of model [acrastt/Vicuna-3B](https://huggingface.co/acrastt/Vicuna-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__Vicuna-3B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-17T13:42:02.549031](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Vicuna-3B/blob/main/results_2023-08-17T13%3A42%3A02.549031.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2811009875581447,\n \"acc_stderr\": 0.03252062846238445,\n \"acc_norm\": 0.2849236147706273,\n \"acc_norm_stderr\": 0.03251779934164602,\n \"mc1\": 0.24479804161566707,\n \"mc1_stderr\": 0.01505186948671501,\n \"mc2\": 0.38343994316155305,\n \"mc2_stderr\": 0.013903929837677163\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3651877133105802,\n \"acc_stderr\": 0.014070265519268804,\n \"acc_norm\": 0.4129692832764505,\n \"acc_norm_stderr\": 0.014388344935398326\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5407289384584744,\n \"acc_stderr\": 0.004973199296339957,\n \"acc_norm\": 0.7184823740290779,\n \"acc_norm_stderr\": 0.004488201756642574\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741713,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 
0.03942772444036623\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.03126511206173044,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.03126511206173044\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518755,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518755\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.0230681888482611,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.0230681888482611\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.22903225806451613,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.035886248000917075,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.035886248000917075\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365904,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365904\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.03097543638684543,\n \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.03097543638684543\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.258974358974359,\n \"acc_stderr\": 
0.02221110681006166,\n \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.02221110681006166\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26422018348623855,\n \"acc_stderr\": 0.01890416417151019,\n \"acc_norm\": 0.26422018348623855,\n \"acc_norm_stderr\": 0.01890416417151019\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605607,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605607\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.02955429260569507,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.02955429260569507\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.40358744394618834,\n \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.30578512396694213,\n \"acc_stderr\": 0.04205953933884124,\n \"acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.04205953933884124\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02934311479809445,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02934311479809445\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2681992337164751,\n \"acc_stderr\": 0.015842430835269445,\n \"acc_norm\": 
0.2681992337164751,\n \"acc_norm_stderr\": 0.015842430835269445\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.024752411960917202,\n \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.024752411960917202\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.02555316999182651,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.02555316999182651\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460987,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460987\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n \"acc_stderr\": 0.010906282617981641,\n \"acc_norm\": 0.23989569752281617,\n \"acc_norm_stderr\": 0.010906282617981641\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.025187786660227276,\n \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.025187786660227276\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28594771241830064,\n \"acc_stderr\": 0.018280485072954676,\n \"acc_norm\": 0.28594771241830064,\n \"acc_norm_stderr\": 0.018280485072954676\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3673469387755102,\n \"acc_stderr\": 0.030862144921087558,\n \"acc_norm\": 0.3673469387755102,\n \"acc_norm_stderr\": 0.030862144921087558\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24479804161566707,\n \"mc1_stderr\": 0.01505186948671501,\n \"mc2\": 0.38343994316155305,\n \"mc2_stderr\": 0.013903929837677163\n }\n}\n```", "repo_url": "https://huggingface.co/acrastt/Vicuna-3B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", 
"data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|arc:challenge|25_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hellaswag|10_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:42:02.549031.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:42:02.549031.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:42:02.549031.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T13:42:02.549031.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T13:42:02.549031.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T13_42_02.549031", "path": ["results_2023-08-17T13:42:02.549031.parquet"]}, {"split": "latest", "path": ["results_2023-08-17T13:42:02.549031.parquet"]}]}]}
2023-08-27T11:25:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of acrastt/Vicuna-3B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model acrastt/Vicuna-3B on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-17T13:42:02.549031 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of acrastt/Vicuna-3B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model acrastt/Vicuna-3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T13:42:02.549031 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of acrastt/Vicuna-3B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model acrastt/Vicuna-3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T13:42:02.549031 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of acrastt/Vicuna-3B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model acrastt/Vicuna-3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-17T13:42:02.549031 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b250f9c5d5636074da052fd804e8ea61d1e94157
# Dataset Card for Evaluation run of acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1](https://huggingface.co/acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__RedPajama-INCITE-Chat-Instruct-3B-V1",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T10:53:28.361871](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__RedPajama-INCITE-Chat-Instruct-3B-V1/blob/main/results_2023-10-15T10-53-28.361871.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.00033145814652192537,
        "f1": 0.049296350671141,
        "f1_stderr": 0.001211652084009881,
        "acc": 0.328542586554474,
        "acc_stderr": 0.008019100667852693
    },
    "harness|drop|3": {
        "em": 0.0010486577181208054,
        "em_stderr": 0.00033145814652192537,
        "f1": 0.049296350671141,
        "f1_stderr": 0.001211652084009881
    },
    "harness|gsm8k|5": {
        "acc": 0.009097801364670205,
        "acc_stderr": 0.0026153265107756716
    },
    "harness|winogrande|5": {
        "acc": 0.6479873717442778,
        "acc_stderr": 0.013422874824929713
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
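The per-example predictions behind the aggregated metrics above live in the task-specific configurations. A minimal sketch, assuming the standard `datasets` API and the `harness_gsm8k_5` configuration and `latest` split named in this dataset's metadata (column names are not documented in the card, so they are printed rather than assumed):

```python
from datasets import load_dataset

# Sketch: load the per-example GSM8K details behind the aggregated accuracy above.
# The configuration and split names are taken from the dataset metadata; columns
# are inspected at runtime rather than assumed.
details = load_dataset(
    "open-llm-leaderboard/details_acrastt__RedPajama-INCITE-Chat-Instruct-3B-V1",
    "harness_gsm8k_5",
    split="latest",
)
print(details.column_names)  # discover the available fields
print(details[0])            # first row of the details table
```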
open-llm-leaderboard/details_acrastt__RedPajama-INCITE-Chat-Instruct-3B-V1
[ "region:us" ]
2023-08-17T22:58:04+00:00
{"pretty_name": "Evaluation run of acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1", "dataset_summary": "Dataset automatically created during the evaluation run of model [acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1](https://huggingface.co/acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__RedPajama-INCITE-Chat-Instruct-3B-V1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T10:53:28.361871](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__RedPajama-INCITE-Chat-Instruct-3B-V1/blob/main/results_2023-10-15T10-53-28.361871.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192537,\n \"f1\": 0.049296350671141,\n \"f1_stderr\": 0.001211652084009881,\n \"acc\": 0.328542586554474,\n \"acc_stderr\": 0.008019100667852693\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192537,\n \"f1\": 0.049296350671141,\n \"f1_stderr\": 0.001211652084009881\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.0026153265107756716\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6479873717442778,\n \"acc_stderr\": 0.013422874824929713\n }\n}\n```", "repo_url": "https://huggingface.co/acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|arc:challenge|25_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T10_53_28.361871", "path": ["**/details_harness|drop|3_2023-10-15T10-53-28.361871.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T10-53-28.361871.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T10_53_28.361871", "path": ["**/details_harness|gsm8k|5_2023-10-15T10-53-28.361871.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T10-53-28.361871.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hellaswag|10_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:50:22.851617.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:50:22.851617.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:50:22.851617.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T09:50:22.851617.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T09:50:22.851617.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T09:50:22.851617.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T10_53_28.361871", "path": ["**/details_harness|winogrande|5_2023-10-15T10-53-28.361871.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T10-53-28.361871.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T09_50_22.851617", "path": ["results_2023-07-31T09:50:22.851617.parquet"]}, {"split": "2023_10_15T10_53_28.361871", "path": ["results_2023-10-15T10-53-28.361871.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T10-53-28.361871.parquet"]}]}]}
2023-10-15T09:53:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1 on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-15T10:53:28.361871 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
[ "# Dataset Card for Evaluation run of acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T10:53:28.361871(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T10:53:28.361871(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model acrastt/RedPajama-INCITE-Chat-Instruct-3B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T10:53:28.361871(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ddce2300c40ab507627722b319ca3ebb1c24115c
# Dataset Card for Evaluation run of kevinpro/Vicuna-13B-CoT

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/kevinpro/Vicuna-13B-CoT
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [kevinpro/Vicuna-13B-CoT](https://huggingface.co/kevinpro/Vicuna-13B-CoT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T13:31:22.626797](https://huggingface.co/datasets/open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT/blob/main/results_2023-09-17T13-31-22.626797.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.029677013422818792,
        "em_stderr": 0.0017378324714143493,
        "f1": 0.09310612416107406,
        "f1_stderr": 0.002167792401176146,
        "acc": 0.4141695683211732,
        "acc_stderr": 0.010019161585538096
    },
    "harness|drop|3": {
        "em": 0.029677013422818792,
        "em_stderr": 0.0017378324714143493,
        "f1": 0.09310612416107406,
        "f1_stderr": 0.002167792401176146
    },
    "harness|gsm8k|5": {
        "acc": 0.08642911296436695,
        "acc_stderr": 0.00774004433710381
    },
    "harness|winogrande|5": {
        "acc": 0.7419100236779794,
        "acc_stderr": 0.012298278833972384
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
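As with the winogrande example in the card above, any other per-task configuration listed in the metadata for this record can be loaded and inspected row by row. The sketch below assumes only the standard `datasets` API; the `harness_gsm8k_5` configuration and the "latest" split name are taken from this card's configuration list, and the per-sample fields are printed rather than assumed.

```python
from datasets import load_dataset

# Per-sample details for the 5-shot GSM8K evaluation of kevinpro/Vicuna-13B-CoT;
# "latest" points to the most recent run recorded for this configuration.
details = load_dataset(
    "open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT",
    "harness_gsm8k_5",
    split="latest",
)

print(len(details))          # number of evaluated examples
print(details.column_names)  # fields recorded by the harness for each example
print(details[0])            # first evaluated example
```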
open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT
[ "region:us" ]
2023-08-17T22:58:12+00:00
{"pretty_name": "Evaluation run of kevinpro/Vicuna-13B-CoT", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevinpro/Vicuna-13B-CoT](https://huggingface.co/kevinpro/Vicuna-13B-CoT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T13:31:22.626797](https://huggingface.co/datasets/open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT/blob/main/results_2023-09-17T13-31-22.626797.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146,\n \"acc\": 0.4141695683211732,\n \"acc_stderr\": 0.010019161585538096\n },\n \"harness|drop|3\": {\n \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.00774004433710381\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n }\n}\n```", "repo_url": "https://huggingface.co/kevinpro/Vicuna-13B-CoT", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T13_31_22.626797", "path": ["**/details_harness|drop|3_2023-09-17T13-31-22.626797.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T13-31-22.626797.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T13_31_22.626797", "path": ["**/details_harness|gsm8k|5_2023-09-17T13-31-22.626797.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T13-31-22.626797.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:33:25.891730.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:33:25.891730.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T13_31_22.626797", "path": ["**/details_harness|winogrande|5_2023-09-17T13-31-22.626797.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T13-31-22.626797.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_33_25.891730", "path": ["results_2023-07-19T18:33:25.891730.parquet"]}, {"split": "2023_09_17T13_31_22.626797", "path": ["results_2023-09-17T13-31-22.626797.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T13-31-22.626797.parquet"]}]}]}
2023-09-17T12:31:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kevinpro/Vicuna-13B-CoT ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model kevinpro/Vicuna-13B-CoT on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T13:31:22.626797 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
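The loading snippet is stripped out of this flattened card text, so here is a minimal sketch of what the call presumably looks like. The repository id is an assumption, derived from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming; the `harness_winogrande_5` config does appear in this record's metadata above.

```python
from datasets import load_dataset

# Repo id is assumed from the naming convention (org/model -> details_org__model);
# adjust it if the actual details repository is named differently.
data = load_dataset(
    "open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT",
    "harness_winogrande_5",  # one per-task config; the others are listed in the metadata
    split="train",           # "train" always points to the latest results
)
print(data)
```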
[ "# Dataset Card for Evaluation run of kevinpro/Vicuna-13B-CoT", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model kevinpro/Vicuna-13B-CoT on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T13:31:22.626797(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kevinpro/Vicuna-13B-CoT", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model kevinpro/Vicuna-13B-CoT on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T13:31:22.626797(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kevinpro/Vicuna-13B-CoT## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kevinpro/Vicuna-13B-CoT on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T13:31:22.626797(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3f2a623c0cd743c0780b04eae00709e64a70801d
# Dataset Card for Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T22:54:23.964972](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload/blob/main/results_2023-10-14T22-54-23.964972.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.00046850650303681895,
        "f1": 0.05818162751677859,
        "f1_stderr": 0.0013245165484434952,
        "acc": 0.4407302535404773,
        "acc_stderr": 0.01044050090848239
    },
    "harness|drop|3": {
        "em": 0.0020973154362416107,
        "em_stderr": 0.00046850650303681895,
        "f1": 0.05818162751677859,
        "f1_stderr": 0.0013245165484434952
    },
    "harness|gsm8k|5": {
        "acc": 0.11902956785443518,
        "acc_stderr": 0.008919702911161629
    },
    "harness|winogrande|5": {
        "acc": 0.7624309392265194,
        "acc_stderr": 0.011961298905803152
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
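As a follow-up to the snippet in the card above, the record's metadata further below also defines an aggregated "results" configuration with a "latest" split. A minimal sketch of reading it, assuming the split names behave as listed there (the exact row layout of the results table is an assumption):

```python
from datasets import load_dataset

# "results" aggregates every run for this model; "latest" points to the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload",
    "results",
    split="latest",
)
# Inspect the first row; it should carry the aggregated em/f1/acc figures
# shown in the "Latest results" JSON above.
print(results[0])
```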
open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload
[ "region:us" ]
2023-08-17T22:58:21+00:00
{"pretty_name": "Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T22:54:23.964972](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload/blob/main/results_2023-10-14T22-54-23.964972.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303681895,\n \"f1\": 0.05818162751677859,\n \"f1_stderr\": 0.0013245165484434952,\n \"acc\": 0.4407302535404773,\n \"acc_stderr\": 0.01044050090848239\n },\n \"harness|drop|3\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303681895,\n \"f1\": 0.05818162751677859,\n \"f1_stderr\": 0.0013245165484434952\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \"acc_stderr\": 0.008919702911161629\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n }\n}\n```", "repo_url": "https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|arc:challenge|25_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T22_54_23.964972", "path": ["**/details_harness|drop|3_2023-10-14T22-54-23.964972.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T22-54-23.964972.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T22_54_23.964972", "path": ["**/details_harness|gsm8k|5_2023-10-14T22-54-23.964972.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T22-54-23.964972.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hellaswag|10_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T12:50:25.764084.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T12:50:25.764084.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T22_54_23.964972", "path": ["**/details_harness|winogrande|5_2023-10-14T22-54-23.964972.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T22-54-23.964972.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T12_50_25.764084", "path": ["results_2023-08-16T12:50:25.764084.parquet"]}, {"split": "2023_10_14T22_54_23.964972", "path": ["results_2023-10-14T22-54-23.964972.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T22-54-23.964972.parquet"]}]}]}
2023-10-14T21:54:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-14T22:54:23.964972 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
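The flattened card text above ends its loading instructions at "do the following:" without the accompanying snippet. A minimal sketch is given below; the details repository id is an assumption inferred from the usual open-llm-leaderboard naming pattern (details_<org>__<model>), since this copy of the card anonymizes the URL.

```python
# Minimal sketch: load one task's details split for this evaluation run.
# The repo id is an assumption based on the "details_<org>__<model>" pattern
# used by the other cards in this dump; config and split names follow the
# conventions described in the card text above.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload",  # assumed repo id
    "harness_winogrande_5",  # one of the 64 per-task configurations
    split="train",           # "train" always points to the latest results
)
```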
[ "# Dataset Card for Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T22:54:23.964972(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T22:54:23.964972(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 32, 31, 180, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T22:54:23.964972(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
db611247df1a6667b1c95adccffab742e610e4f4
# Dataset Card for Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Lajonbot/vicuna-7b-v1.5-PL-lora_unload](https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T03:39:03.666834](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload/blob/main/results_2023-09-23T03-39-03.666834.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0065016778523489934,
        "em_stderr": 0.0008230684297223919,
        "f1": 0.06541946308724841,
        "f1_stderr": 0.0015883719778429714,
        "acc": 0.3959174184839032,
        "acc_stderr": 0.009871427981667812
    },
    "harness|drop|3": {
        "em": 0.0065016778523489934,
        "em_stderr": 0.0008230684297223919,
        "f1": 0.06541946308724841,
        "f1_stderr": 0.0015883719778429714
    },
    "harness|gsm8k|5": {
        "acc": 0.07202426080363912,
        "acc_stderr": 0.007121147983537124
    },
    "harness|winogrande|5": {
        "acc": 0.7198105761641673,
        "acc_stderr": 0.012621707979798499
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload
[ "region:us" ]
2023-08-17T22:58:33+00:00
{"pretty_name": "Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lajonbot/vicuna-7b-v1.5-PL-lora_unload](https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T03:39:03.666834](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload/blob/main/results_2023-09-23T03-39-03.666834.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0065016778523489934,\n \"em_stderr\": 0.0008230684297223919,\n \"f1\": 0.06541946308724841,\n \"f1_stderr\": 0.0015883719778429714,\n \"acc\": 0.3959174184839032,\n \"acc_stderr\": 0.009871427981667812\n },\n \"harness|drop|3\": {\n \"em\": 0.0065016778523489934,\n \"em_stderr\": 0.0008230684297223919,\n \"f1\": 0.06541946308724841,\n \"f1_stderr\": 0.0015883719778429714\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07202426080363912,\n \"acc_stderr\": 0.007121147983537124\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7198105761641673,\n \"acc_stderr\": 0.012621707979798499\n }\n}\n```", "repo_url": "https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|arc:challenge|25_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T03_39_03.666834", "path": ["**/details_harness|drop|3_2023-09-23T03-39-03.666834.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T03-39-03.666834.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T03_39_03.666834", "path": ["**/details_harness|gsm8k|5_2023-09-23T03-39-03.666834.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T03-39-03.666834.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hellaswag|10_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T16:36:13.785976.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T16:36:13.785976.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T03_39_03.666834", "path": ["**/details_harness|winogrande|5_2023-09-23T03-39-03.666834.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T03-39-03.666834.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T16_36_13.785976", "path": ["results_2023-08-02T16:36:13.785976.parquet"]}, {"split": "2023_09_23T03_39_03.666834", "path": ["results_2023-09-23T03-39-03.666834.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T03-39-03.666834.parquet"]}]}]}
2023-09-23T02:39:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Lajonbot/vicuna-7b-v1.5-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T03:39:03.666834 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
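As above, this flattened copy drops the code snippet that should follow "do the following:". A short sketch, mirroring the call given in this record's full card and metadata:

```python
# Load one task's details split for this evaluation run; the repo id and
# config name match the example in this record's full card text above.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload",
    "harness_winogrande_5",
    split="train",
)
```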
[ "# Dataset Card for Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/vicuna-7b-v1.5-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T03:39:03.666834(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/vicuna-7b-v1.5-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T03:39:03.666834(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/vicuna-7b-v1.5-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T03:39:03.666834(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c5ea76f353660759eb7fb04af8b838ee7eaa8a29
# Dataset Card for Evaluation run of Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T12:29:41.474491](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload/blob/main/results_2023-09-17T12-29-41.474491.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.0004191330178826897,
        "f1": 0.058242449664429756,
        "f1_stderr": 0.0013938395704941914,
        "acc": 0.39479575124777627,
        "acc_stderr": 0.009795911187121632
    },
    "harness|drop|3": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.0004191330178826897,
        "f1": 0.058242449664429756,
        "f1_stderr": 0.0013938395704941914
    },
    "harness|gsm8k|5": {
        "acc": 0.06899166034874905,
        "acc_stderr": 0.006980995834838582
    },
    "harness|winogrande|5": {
        "acc": 0.7205998421468035,
        "acc_stderr": 0.01261082653940468
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
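The card above only demonstrates loading the `harness_winogrande_5` details. As a minimal sketch of the other access path it mentions, the aggregated "results" configuration can be loaded the same way; the repository, configuration, and split names below come from this record, while the printed column layout is an assumption rather than a documented schema.

```python
from datasets import load_dataset

# Aggregated metrics across runs of this model; "latest" aliases the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload",
    "results",
    split="latest",
)

# Inspect the available columns before relying on any particular metric name (assumed layout).
print(results.column_names)
print(results[0])
```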
open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload
[ "region:us" ]
2023-08-17T22:58:41+00:00
{"pretty_name": "Evaluation run of Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T12:29:41.474491](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload/blob/main/results_2023-09-17T12-29-41.474491.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826897,\n \"f1\": 0.058242449664429756,\n \"f1_stderr\": 0.0013938395704941914,\n \"acc\": 0.39479575124777627,\n \"acc_stderr\": 0.009795911187121632\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826897,\n \"f1\": 0.058242449664429756,\n \"f1_stderr\": 0.0013938395704941914\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06899166034874905,\n \"acc_stderr\": 0.006980995834838582\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.01261082653940468\n }\n}\n```", "repo_url": "https://huggingface.co/Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T12_29_41.474491", "path": ["**/details_harness|drop|3_2023-09-17T12-29-41.474491.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T12-29-41.474491.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T12_29_41.474491", "path": ["**/details_harness|gsm8k|5_2023-09-17T12-29-41.474491.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T12-29-41.474491.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hellaswag|10_2023-08-01T13:20:11.770272.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:20:11.770272.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:20:11.770272.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:20:11.770272.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:20:11.770272.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:20:11.770272.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:20:11.770272.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T12_29_41.474491", "path": ["**/details_harness|winogrande|5_2023-09-17T12-29-41.474491.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T12-29-41.474491.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T13_20_11.770272", "path": ["results_2023-08-01T13:20:11.770272.parquet"]}, {"split": "2023_09_17T12_29_41.474491", "path": ["results_2023-09-17T12-29-41.474491.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T12-29-41.474491.parquet"]}]}]}
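The metadata above enumerates one configuration per evaluated task (plus the aggregated "results" configuration), each exposing a timestamped split per run and a "latest" alias. As a hedged sketch, that structure can also be discovered programmatically instead of being read off the YAML; `get_dataset_config_names` and `get_dataset_split_names` are standard `datasets` helpers, and the repository id is taken from this record.

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload"

# List the 64 per-task configurations plus "results".
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each configuration has one split per run timestamp plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_gsm8k_5"))
```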
2023-09-17T11:29:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-09-17T12:29:41.474491 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
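The plain-text rendering above keeps the sentence "To load the details from a run, you can for instance do the following:" but drops the snippet it referred to; the full card earlier in this record gives it as:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__Llama-2-7b-chat-hf-instruct-pl-lora_unload",
    "harness_winogrande_5",
    split="train",
)
```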
[ "# Dataset Card for Evaluation run of Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T12:29:41.474491(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T12:29:41.474491(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 34, 31, 182, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/Llama-2-7b-chat-hf-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T12:29:41.474491(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cdf3a8af2548244078a0a6a914138e9548eb90d1
# Dataset Card for Evaluation run of Lajonbot/vicuna-13b-v1.3-PL-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Lajonbot/vicuna-13b-v1.3-PL-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Lajonbot/vicuna-13b-v1.3-PL-lora_unload](https://huggingface.co/Lajonbot/vicuna-13b-v1.3-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T21:50:51.358366](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload/blob/main/results_2023-10-14T21-50-51.358366.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0036703020134228187,
        "em_stderr": 0.0006192871806511051,
        "f1": 0.06569735738255048,
        "f1_stderr": 0.0015051170747505127,
        "acc": 0.42553613539711327,
        "acc_stderr": 0.009940039476646956
    },
    "harness|drop|3": {
        "em": 0.0036703020134228187,
        "em_stderr": 0.0006192871806511051,
        "f1": 0.06569735738255048,
        "f1_stderr": 0.0015051170747505127
    },
    "harness|gsm8k|5": {
        "acc": 0.09021986353297953,
        "acc_stderr": 0.007891537108449993
    },
    "harness|winogrande|5": {
        "acc": 0.760852407261247,
        "acc_stderr": 0.011988541844843917
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
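Beyond the `harness_winogrande_5` example in the card, any per-task configuration can be loaded the same way. As a sketch, here are the GSM8K details for this model; the configuration and split names (including the timestamped split) are copied from this record's metadata, not invented.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload"

# "latest" always aliases the most recent run of this task...
gsm8k_latest = load_dataset(repo, "harness_gsm8k_5", split="latest")

# ...while each individual run is also addressable via its timestamped split name.
gsm8k_run = load_dataset(repo, "harness_gsm8k_5", split="2023_10_14T21_50_51.358366")

print(len(gsm8k_latest), gsm8k_latest.column_names)
```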
open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload
[ "region:us" ]
2023-08-17T22:58:50+00:00
{"pretty_name": "Evaluation run of Lajonbot/vicuna-13b-v1.3-PL-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lajonbot/vicuna-13b-v1.3-PL-lora_unload](https://huggingface.co/Lajonbot/vicuna-13b-v1.3-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T21:50:51.358366](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload/blob/main/results_2023-10-14T21-50-51.358366.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511051,\n \"f1\": 0.06569735738255048,\n \"f1_stderr\": 0.0015051170747505127,\n \"acc\": 0.42553613539711327,\n \"acc_stderr\": 0.009940039476646956\n },\n \"harness|drop|3\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511051,\n \"f1\": 0.06569735738255048,\n \"f1_stderr\": 0.0015051170747505127\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09021986353297953,\n \"acc_stderr\": 0.007891537108449993\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843917\n }\n}\n```", "repo_url": "https://huggingface.co/Lajonbot/vicuna-13b-v1.3-PL-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|arc:challenge|25_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T21_50_51.358366", "path": ["**/details_harness|drop|3_2023-10-14T21-50-51.358366.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T21-50-51.358366.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T21_50_51.358366", "path": ["**/details_harness|gsm8k|5_2023-10-14T21-50-51.358366.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T21-50-51.358366.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hellaswag|10_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T14:55:51.592566.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T14:55:51.592566.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T14:55:51.592566.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T14:55:51.592566.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T14:55:51.592566.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T14:55:51.592566.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T21_50_51.358366", "path": ["**/details_harness|winogrande|5_2023-10-14T21-50-51.358366.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T21-50-51.358366.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T14_55_51.592566", "path": ["results_2023-08-02T14:55:51.592566.parquet"]}, {"split": "2023_10_14T21_50_51.358366", "path": ["results_2023-10-14T21-50-51.358366.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T21-50-51.358366.parquet"]}]}]}
2023-10-14T20:51:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lajonbot/vicuna-13b-v1.3-PL-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Lajonbot/vicuna-13b-v1.3-PL-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-14T21:50:51.358366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
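For the Lajonbot/vicuna-13b-v1.3-PL-lora_unload details dataset described above, each per-task configuration listed in the metadata block (for example `harness_gsm8k_5` or `harness_hendrycksTest_abstract_algebra_5`) can be loaded the same way as in the summary's own snippet. A minimal sketch, assuming the `datasets` library is installed and the split names resolve exactly as listed in that configuration block:

```python
from datasets import load_dataset

# Load the GSM8K 5-shot details for Lajonbot/vicuna-13b-v1.3-PL-lora_unload.
# "latest" is the split that the configuration block points at the most
# recent run (2023-10-14T21-50-51.358366).
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload",
    "harness_gsm8k_5",
    split="latest",
)

# Any other per-task configuration works the same way, e.g. a single MMLU subtask.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__vicuna-13b-v1.3-PL-lora_unload",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

print(gsm8k_details)
```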
[ "# Dataset Card for Evaluation run of Lajonbot/vicuna-13b-v1.3-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/vicuna-13b-v1.3-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T21:50:51.358366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lajonbot/vicuna-13b-v1.3-PL-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/vicuna-13b-v1.3-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T21:50:51.358366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lajonbot/vicuna-13b-v1.3-PL-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/vicuna-13b-v1.3-PL-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T21:50:51.358366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8c53f71bc0992a009633f6cdd502c840dc046717
# Dataset Card for Evaluation run of Lajonbot/tableBeluga-7B-instruct-pl-lora_unload

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Lajonbot/tableBeluga-7B-instruct-pl-lora_unload
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Lajonbot/tableBeluga-7B-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/tableBeluga-7B-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__tableBeluga-7B-instruct-pl-lora_unload",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T19:20:10.302969](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__tableBeluga-7B-instruct-pl-lora_unload/blob/main/results_2023-09-17T19-20-10.302969.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.07602768456375839,
        "em_stderr": 0.0027142822886132433,
        "f1": 0.14862416107382526,
        "f1_stderr": 0.0030033713869214236,
        "acc": 0.4151299715828343,
        "acc_stderr": 0.009762520250486784
    },
    "harness|drop|3": {
        "em": 0.07602768456375839,
        "em_stderr": 0.0027142822886132433,
        "f1": 0.14862416107382526,
        "f1_stderr": 0.0030033713869214236
    },
    "harness|gsm8k|5": {
        "acc": 0.07808946171341925,
        "acc_stderr": 0.007390654481108218
    },
    "harness|winogrande|5": {
        "acc": 0.7521704814522494,
        "acc_stderr": 0.01213438601986535
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
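As a complement to the snippet above, the aggregated "results" configuration mentioned in the summary can be loaded the same way. A minimal sketch, assuming that configuration and its "latest" split resolve as described; the exact column layout of the aggregated results parquet is not shown in this card, so it should be inspected rather than relied upon:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__tableBeluga-7B-instruct-pl-lora_unload",
    "results",
    split="latest",
)

# Inspect the schema first; the specific field names are an assumption here.
print(results.column_names)
print(results[0])
```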
open-llm-leaderboard/details_Lajonbot__tableBeluga-7B-instruct-pl-lora_unload
[ "region:us" ]
2023-08-17T22:58:59+00:00
{"pretty_name": "Evaluation run of Lajonbot/tableBeluga-7B-instruct-pl-lora_unload", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lajonbot/tableBeluga-7B-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/tableBeluga-7B-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__tableBeluga-7B-instruct-pl-lora_unload\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T19:20:10.302969](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__tableBeluga-7B-instruct-pl-lora_unload/blob/main/results_2023-09-17T19-20-10.302969.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07602768456375839,\n \"em_stderr\": 0.0027142822886132433,\n \"f1\": 0.14862416107382526,\n \"f1_stderr\": 0.0030033713869214236,\n \"acc\": 0.4151299715828343,\n \"acc_stderr\": 0.009762520250486784\n },\n \"harness|drop|3\": {\n \"em\": 0.07602768456375839,\n \"em_stderr\": 0.0027142822886132433,\n \"f1\": 0.14862416107382526,\n \"f1_stderr\": 0.0030033713869214236\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \"acc_stderr\": 0.007390654481108218\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n }\n}\n```", "repo_url": "https://huggingface.co/Lajonbot/tableBeluga-7B-instruct-pl-lora_unload", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|arc:challenge|25_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T19_20_10.302969", "path": ["**/details_harness|drop|3_2023-09-17T19-20-10.302969.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T19-20-10.302969.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T19_20_10.302969", "path": ["**/details_harness|gsm8k|5_2023-09-17T19-20-10.302969.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T19-20-10.302969.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hellaswag|10_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T09:13:12.299308.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T09:13:12.299308.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T09:13:12.299308.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T09:13:12.299308.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T09:13:12.299308.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T09:13:12.299308.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T19_20_10.302969", "path": ["**/details_harness|winogrande|5_2023-09-17T19-20-10.302969.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T19-20-10.302969.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_03T09_13_12.299308", "path": ["results_2023-08-03T09:13:12.299308.parquet"]}, {"split": "2023_09_17T19_20_10.302969", "path": ["results_2023-09-17T19-20-10.302969.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T19-20-10.302969.parquet"]}]}]}
2023-09-17T18:20:22+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lajonbot/tableBeluga-7B-instruct-pl-lora_unload ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Lajonbot/tableBeluga-7B-instruct-pl-lora_unload on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T19:20:10.302969 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
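The processed card text above ends its "To load the details from a run, you can for instance do the following:" sentence where the original code snippet was stripped. A minimal sketch of that loading call follows, assuming the repository id obeys the `details_<org>__<model>` naming pattern these cards use, and using the `harness_winogrande_5` config and `latest` split declared in the metadata above; verify both against the actual repository before relying on them.

```python
from datasets import load_dataset

# Sketch only: the repo id is inferred from the details_<org>__<model> naming
# convention of these leaderboard detail datasets, not quoted from this row.
data = load_dataset(
    "open-llm-leaderboard/details_Lajonbot__tableBeluga-7B-instruct-pl-lora_unload",
    "harness_winogrande_5",  # one of the configs declared in the metadata above
    split="latest",          # the split that always points at the newest run
)
```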
[ "# Dataset Card for Evaluation run of Lajonbot/tableBeluga-7B-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/tableBeluga-7B-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T19:20:10.302969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lajonbot/tableBeluga-7B-instruct-pl-lora_unload", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/tableBeluga-7B-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T19:20:10.302969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lajonbot/tableBeluga-7B-instruct-pl-lora_unload## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Lajonbot/tableBeluga-7B-instruct-pl-lora_unload on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T19:20:10.302969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
506ac83c48a8c2454347624f81f67bedb3c2489a
# Dataset Card for Evaluation run of CalderaAI/30B-Lazarus

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CalderaAI/30B-Lazarus
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [CalderaAI/30B-Lazarus](https://huggingface.co/CalderaAI/30B-Lazarus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__30B-Lazarus",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T10:30:29.206402](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__30B-Lazarus/blob/main/results_2023-10-13T10-30-29.206402.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.15866191275167785,
        "em_stderr": 0.0037416337044887996,
        "f1": 0.2289985318791943,
        "f1_stderr": 0.003861278919536814,
        "acc": 0.43053621617869603,
        "acc_stderr": 0.009464164192315844
    },
    "harness|drop|3": {
        "em": 0.15866191275167785,
        "em_stderr": 0.0037416337044887996,
        "f1": 0.2289985318791943,
        "f1_stderr": 0.003861278919536814
    },
    "harness|gsm8k|5": {
        "acc": 0.07733131159969674,
        "acc_stderr": 0.007357713523222348
    },
    "harness|winogrande|5": {
        "acc": 0.7837411207576953,
        "acc_stderr": 0.01157061486140934
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_CalderaAI__30B-Lazarus
[ "region:us" ]
2023-08-17T22:59:07+00:00
{"pretty_name": "Evaluation run of CalderaAI/30B-Lazarus", "dataset_summary": "Dataset automatically created during the evaluation run of model [CalderaAI/30B-Lazarus](https://huggingface.co/CalderaAI/30B-Lazarus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__30B-Lazarus\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T10:30:29.206402](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__30B-Lazarus/blob/main/results_2023-10-13T10-30-29.206402.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15866191275167785,\n \"em_stderr\": 0.0037416337044887996,\n \"f1\": 0.2289985318791943,\n \"f1_stderr\": 0.003861278919536814,\n \"acc\": 0.43053621617869603,\n \"acc_stderr\": 0.009464164192315844\n },\n \"harness|drop|3\": {\n \"em\": 0.15866191275167785,\n \"em_stderr\": 0.0037416337044887996,\n \"f1\": 0.2289985318791943,\n \"f1_stderr\": 0.003861278919536814\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \"acc_stderr\": 0.007357713523222348\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140934\n }\n}\n```", "repo_url": "https://huggingface.co/CalderaAI/30B-Lazarus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T10_30_29.206402", "path": ["**/details_harness|drop|3_2023-10-13T10-30-29.206402.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T10-30-29.206402.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T10_30_29.206402", "path": ["**/details_harness|gsm8k|5_2023-10-13T10-30-29.206402.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T10-30-29.206402.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:16:39.327210.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:16:39.327210.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:16:39.327210.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:16:39.327210.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:16:39.327210.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T10_30_29.206402", "path": ["**/details_harness|winogrande|5_2023-10-13T10-30-29.206402.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T10-30-29.206402.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_16_39.327210", "path": ["results_2023-07-19T22:16:39.327210.parquet"]}, {"split": "2023_10_13T10_30_29.206402", "path": ["results_2023-10-13T10-30-29.206402.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T10-30-29.206402.parquet"]}]}]}
2023-10-13T09:30:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CalderaAI/30B-Lazarus ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model CalderaAI/30B-Lazarus on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T10:30:29.206402 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
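The summary above describes loading per-run details without showing the call itself. A minimal sketch, assuming the `datasets` library is installed and that the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming used by similar evaluation detail repositories (the exact repository id is an assumption here):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the details_<org>__<model> pattern.
repo_id = "open-llm-leaderboard/details_CalderaAI__30B-Lazarus"

# Load the most recent Winogrande details for this run; per the configs
# metadata above, the "latest" split points to the newest parquet file.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data)
```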
[ "# Dataset Card for Evaluation run of CalderaAI/30B-Lazarus", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/30B-Lazarus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T10:30:29.206402(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CalderaAI/30B-Lazarus", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/30B-Lazarus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T10:30:29.206402(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 68, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CalderaAI/30B-Lazarus## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/30B-Lazarus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T10:30:29.206402(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7af9428f881feb9ef2be7ddd9d6579aa2db34e6b
# Dataset Card for Evaluation run of CalderaAI/13B-Ouroboros

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CalderaAI/13B-Ouroboros
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [CalderaAI/13B-Ouroboros](https://huggingface.co/CalderaAI/13B-Ouroboros) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-Ouroboros",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T23:20:58.844549](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Ouroboros/blob/main/results_2023-10-15T23-20-58.844549.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.07781040268456375,
        "em_stderr": 0.0027432702403905524,
        "f1": 0.15357172818791914,
        "f1_stderr": 0.003043981707354766,
        "acc": 0.29154043297731597,
        "acc_stderr": 0.007865813710750614
    },
    "harness|drop|3": {
        "em": 0.07781040268456375,
        "em_stderr": 0.0027432702403905524,
        "f1": 0.15357172818791914,
        "f1_stderr": 0.003043981707354766
    },
    "harness|gsm8k|5": {
        "acc": 0.004548900682335102,
        "acc_stderr": 0.0018535550440036204
    },
    "harness|winogrande|5": {
        "acc": 0.5785319652722968,
        "acc_stderr": 0.013878072377497606
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
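The card above shows how to load a single task configuration; the aggregated "results" configuration mentioned in the summary can be read the same way. A minimal sketch, assuming the `datasets` library is installed and that the "results" configuration exposes a "latest" split as listed in the configs metadata below:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration for this evaluation run; per
# the configs metadata, its "latest" split points to the newest results
# parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_CalderaAI__13B-Ouroboros",
    "results",
    split="latest",
)

# Inspect the columns before relying on specific metric names, since the
# stored fields can change between harness versions.
print(results)
print(results[0])
```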
open-llm-leaderboard/details_CalderaAI__13B-Ouroboros
[ "region:us" ]
2023-08-17T22:59:16+00:00
{"pretty_name": "Evaluation run of CalderaAI/13B-Ouroboros", "dataset_summary": "Dataset automatically created during the evaluation run of model [CalderaAI/13B-Ouroboros](https://huggingface.co/CalderaAI/13B-Ouroboros) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__13B-Ouroboros\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T23:20:58.844549](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Ouroboros/blob/main/results_2023-10-15T23-20-58.844549.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07781040268456375,\n \"em_stderr\": 0.0027432702403905524,\n \"f1\": 0.15357172818791914,\n \"f1_stderr\": 0.003043981707354766,\n \"acc\": 0.29154043297731597,\n \"acc_stderr\": 0.007865813710750614\n },\n \"harness|drop|3\": {\n \"em\": 0.07781040268456375,\n \"em_stderr\": 0.0027432702403905524,\n \"f1\": 0.15357172818791914,\n \"f1_stderr\": 0.003043981707354766\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5785319652722968,\n \"acc_stderr\": 0.013878072377497606\n }\n}\n```", "repo_url": "https://huggingface.co/CalderaAI/13B-Ouroboros", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T23_20_58.844549", "path": ["**/details_harness|drop|3_2023-10-15T23-20-58.844549.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T23-20-58.844549.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T23_20_58.844549", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-20-58.844549.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-20-58.844549.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:46:48.892044.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:46:48.892044.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:46:48.892044.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:46:48.892044.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:46:48.892044.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T23_20_58.844549", "path": ["**/details_harness|winogrande|5_2023-10-15T23-20-58.844549.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T23-20-58.844549.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T14_46_48.892044", "path": ["results_2023-07-24T14:46:48.892044.parquet"]}, {"split": "2023_10_15T23_20_58.844549", "path": ["results_2023-10-15T23-20-58.844549.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T23-20-58.844549.parquet"]}]}]}
2023-10-15T22:21:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CalderaAI/13B-Ouroboros ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model CalderaAI/13B-Ouroboros on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T23:20:58.844549 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of CalderaAI/13B-Ouroboros", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-Ouroboros on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T23:20:58.844549(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CalderaAI/13B-Ouroboros", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-Ouroboros on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T23:20:58.844549(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CalderaAI/13B-Ouroboros## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-Ouroboros on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T23:20:58.844549(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a6dcc5560856e72d33db264edf947505896c391e
# Dataset Card for Evaluation run of CalderaAI/13B-BlueMethod

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CalderaAI/13B-BlueMethod
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [CalderaAI/13B-BlueMethod](https://huggingface.co/CalderaAI/13B-BlueMethod) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-BlueMethod",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-20T16:03:50.235184](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-BlueMethod/blob/main/results_2023-09-20T16-03-50.235184.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.3099832214765101,
        "em_stderr": 0.0047362931024528656,
        "f1": 0.3761765939597331,
        "f1_stderr": 0.0046456997549096015,
        "acc": 0.4246011633744681,
        "acc_stderr": 0.009599007352566805
    },
    "harness|drop|3": {
        "em": 0.3099832214765101,
        "em_stderr": 0.0047362931024528656,
        "f1": 0.3761765939597331,
        "f1_stderr": 0.0046456997549096015
    },
    "harness|gsm8k|5": {
        "acc": 0.07808946171341925,
        "acc_stderr": 0.0073906544811082366
    },
    "harness|winogrande|5": {
        "acc": 0.771112865035517,
        "acc_stderr": 0.011807360224025376
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
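As a complement to the single-task example in the card above, the per-task configurations can also be discovered programmatically. A small sketch, assuming the `datasets` library is installed; the repository and configuration names are taken from the card and the configs metadata below:

```python
from datasets import get_dataset_config_names, load_dataset

# List the per-task configurations stored in this details repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_CalderaAI__13B-BlueMethod"
)
print(len(configs), configs[:5])

# Load the most recent per-example details for one task (3-shot DROP here).
drop_details = load_dataset(
    "open-llm-leaderboard/details_CalderaAI__13B-BlueMethod",
    "harness_drop_3",
    split="latest",
)
print(drop_details)
```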
open-llm-leaderboard/details_CalderaAI__13B-BlueMethod
[ "region:us" ]
2023-08-17T22:59:25+00:00
{"pretty_name": "Evaluation run of CalderaAI/13B-BlueMethod", "dataset_summary": "Dataset automatically created during the evaluation run of model [CalderaAI/13B-BlueMethod](https://huggingface.co/CalderaAI/13B-BlueMethod) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__13B-BlueMethod\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-20T16:03:50.235184](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-BlueMethod/blob/main/results_2023-09-20T16-03-50.235184.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3099832214765101,\n \"em_stderr\": 0.0047362931024528656,\n \"f1\": 0.3761765939597331,\n \"f1_stderr\": 0.0046456997549096015,\n \"acc\": 0.4246011633744681,\n \"acc_stderr\": 0.009599007352566805\n },\n \"harness|drop|3\": {\n \"em\": 0.3099832214765101,\n \"em_stderr\": 0.0047362931024528656,\n \"f1\": 0.3761765939597331,\n \"f1_stderr\": 0.0046456997549096015\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \"acc_stderr\": 0.0073906544811082366\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025376\n }\n}\n```", "repo_url": "https://huggingface.co/CalderaAI/13B-BlueMethod", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_20T16_03_50.235184", "path": ["**/details_harness|drop|3_2023-09-20T16-03-50.235184.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-20T16-03-50.235184.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_20T16_03_50.235184", "path": ["**/details_harness|gsm8k|5_2023-09-20T16-03-50.235184.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-20T16-03-50.235184.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:36:47.122036.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:36:47.122036.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T13:36:47.122036.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:36:47.122036.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T13:36:47.122036.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_20T16_03_50.235184", "path": ["**/details_harness|winogrande|5_2023-09-20T16-03-50.235184.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-20T16-03-50.235184.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T13_36_47.122036", "path": ["results_2023-07-24T13:36:47.122036.parquet"]}, {"split": "2023_09_20T16_03_50.235184", "path": ["results_2023-09-20T16-03-50.235184.parquet"]}, {"split": "latest", "path": ["results_2023-09-20T16-03-50.235184.parquet"]}]}]}
2023-09-20T15:04:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CalderaAI/13B-BlueMethod ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model CalderaAI/13B-BlueMethod on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the snippet just below this field): ## Latest results These are the latest results from run 2023-09-20T16:03:50.235184 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
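The plain-text card above refers to a loading snippet that it does not show; the call below is taken from this run's `configs` metadata for the same repository (same repo id, configuration, and split):

```python
from datasets import load_dataset

# Load the WinoGrande details for CalderaAI/13B-BlueMethod
# (the card states the "train" split always points to the latest results)
data = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-BlueMethod",
	"harness_winogrande_5",
	split="train")
```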
[ "# Dataset Card for Evaluation run of CalderaAI/13B-BlueMethod", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-BlueMethod on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-20T16:03:50.235184(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CalderaAI/13B-BlueMethod", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-BlueMethod on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-20T16:03:50.235184(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CalderaAI/13B-BlueMethod## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-BlueMethod on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-20T16:03:50.235184(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
39a1a4736b93a61561448a7625be1416c699100c
# Dataset Card for Evaluation run of CalderaAI/13B-Legerdemain-L2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/CalderaAI/13B-Legerdemain-L2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [CalderaAI/13B-Legerdemain-L2](https://huggingface.co/CalderaAI/13B-Legerdemain-L2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-12T20:33:10.328879](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2/blob/main/results_2023-10-12T20-33-10.328879.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415904,
        "f1": 0.06216547818791966,
        "f1_stderr": 0.0013785278979549318,
        "acc": 0.4412861505062612,
        "acc_stderr": 0.010705008172209724
    },
    "harness|drop|3": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415904,
        "f1": 0.06216547818791966,
        "f1_stderr": 0.0013785278979549318
    },
    "harness|gsm8k|5": {
        "acc": 0.13040181956027294,
        "acc_stderr": 0.0092756303245541
    },
    "harness|winogrande|5": {
        "acc": 0.7521704814522494,
        "acc_stderr": 0.01213438601986535
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
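Besides the per-task configurations, the aggregated metrics quoted under "Latest results" live in the "results" configuration described above. A minimal sketch, assuming the same config/split layout listed in the run metadata for these evaluation repositories (a `results` configuration with a `latest` split, as in the BlueMethod metadata earlier):

```python
from datasets import load_dataset

# Load the aggregated results of the most recent run instead of a single task's details
results = load_dataset(
    "open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2",
    "results",
    split="latest",
)
print(results[0])  # the row holding the aggregated metrics for that run
```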
open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2
[ "region:us" ]
2023-08-17T22:59:34+00:00
{"pretty_name": "Evaluation run of CalderaAI/13B-Legerdemain-L2", "dataset_summary": "Dataset automatically created during the evaluation run of model [CalderaAI/13B-Legerdemain-L2](https://huggingface.co/CalderaAI/13B-Legerdemain-L2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T20:33:10.328879](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2/blob/main/results_2023-10-12T20-33-10.328879.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415904,\n \"f1\": 0.06216547818791966,\n \"f1_stderr\": 0.0013785278979549318,\n \"acc\": 0.4412861505062612,\n \"acc_stderr\": 0.010705008172209724\n },\n \"harness|drop|3\": {\n \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415904,\n \"f1\": 0.06216547818791966,\n \"f1_stderr\": 0.0013785278979549318\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13040181956027294,\n \"acc_stderr\": 0.0092756303245541\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n }\n}\n```", "repo_url": "https://huggingface.co/CalderaAI/13B-Legerdemain-L2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T20_33_10.328879", "path": ["**/details_harness|drop|3_2023-10-12T20-33-10.328879.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T20-33-10.328879.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T20_33_10.328879", "path": ["**/details_harness|gsm8k|5_2023-10-12T20-33-10.328879.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T20-33-10.328879.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:34:37.986977.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:34:37.986977.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T20_33_10.328879", "path": ["**/details_harness|winogrande|5_2023-10-12T20-33-10.328879.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T20-33-10.328879.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_34_37.986977", "path": ["results_2023-08-09T11:34:37.986977.parquet"]}, {"split": "2023_10_12T20_33_10.328879", "path": ["results_2023-10-12T20-33-10.328879.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T20-33-10.328879.parquet"]}]}]}
2023-10-12T19:33:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CalderaAI/13B-Legerdemain-L2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model CalderaAI/13B-Legerdemain-L2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-12T20:33:10.328879 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
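The load call referred to above is not reproduced in this flattened text field. A minimal sketch of it follows, assuming the repository id follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used elsewhere in this document and using the `harness_winogrande_5` configuration listed in the metadata above:

```python
from datasets import load_dataset

# Assumed repository id, derived from the open-llm-leaderboard naming pattern
# (details_<org>__<model>); the config name comes from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_CalderaAI__13B-Legerdemain-L2",
    "harness_winogrande_5",
    split="train",
)
```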
[ "# Dataset Card for Evaluation run of CalderaAI/13B-Legerdemain-L2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-Legerdemain-L2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T20:33:10.328879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CalderaAI/13B-Legerdemain-L2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-Legerdemain-L2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T20:33:10.328879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CalderaAI/13B-Legerdemain-L2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CalderaAI/13B-Legerdemain-L2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T20:33:10.328879(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
072fd8fb9ac16df986a8d279a8cdde3a2c9a74f3
# Dataset Card for Evaluation run of mosaicml/mpt-30b-chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/mosaicml/mpt-30b-chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [mosaicml/mpt-30b-chat](https://huggingface.co/mosaicml/mpt-30b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-30b-chat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T05:09:13.488891](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-30b-chat/blob/main/results_2023-10-16T05-09-13.488891.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.00043200973460389824,
        "f1": 0.06156250000000011,
        "f1_stderr": 0.001404482472524456,
        "acc": 0.4371318828152442,
        "acc_stderr": 0.010557145720065584
    },
    "harness|drop|3": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.00043200973460389824,
        "f1": 0.06156250000000011,
        "f1_stderr": 0.001404482472524456
    },
    "harness|gsm8k|5": {
        "acc": 0.12130401819560273,
        "acc_stderr": 0.008992888497275597
    },
    "harness|winogrande|5": {
        "acc": 0.7529597474348856,
        "acc_stderr": 0.01212140294285557
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
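Beyond the basic call shown in the card, the timestamped splits and the "latest" alias described above can be addressed directly. A minimal sketch, assuming the `harness_gsm8k_5` configuration and the split names listed in this record's metadata below:

```python
from datasets import load_dataset

# Load the most recent GSM8K details for this model; "latest" is an alias for
# the newest timestamped run (see the configs listed in the metadata).
latest = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-30b-chat",
    "harness_gsm8k_5",
    split="latest",
)

# A specific run can instead be selected by its timestamped split name.
run = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-30b-chat",
    "harness_gsm8k_5",
    split="2023_10_16T05_09_13.488891",
)
```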
open-llm-leaderboard/details_mosaicml__mpt-30b-chat
[ "region:us" ]
2023-08-17T22:59:42+00:00
{"pretty_name": "Evaluation run of mosaicml/mpt-30b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [mosaicml/mpt-30b-chat](https://huggingface.co/mosaicml/mpt-30b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-30b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T05:09:13.488891](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-30b-chat/blob/main/results_2023-10-16T05-09-13.488891.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460389824,\n \"f1\": 0.06156250000000011,\n \"f1_stderr\": 0.001404482472524456,\n \"acc\": 0.4371318828152442,\n \"acc_stderr\": 0.010557145720065584\n },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460389824,\n \"f1\": 0.06156250000000011,\n \"f1_stderr\": 0.001404482472524456\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12130401819560273,\n \"acc_stderr\": 0.008992888497275597\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.01212140294285557\n }\n}\n```", "repo_url": "https://huggingface.co/mosaicml/mpt-30b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|arc:challenge|25_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|arc:challenge|25_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T01_18_40.981025", "path": ["**/details_harness|drop|3_2023-10-13T01-18-40.981025.parquet"]}, {"split": "2023_10_16T05_09_13.488891", "path": ["**/details_harness|drop|3_2023-10-16T05-09-13.488891.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T05-09-13.488891.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T01_18_40.981025", "path": ["**/details_harness|gsm8k|5_2023-10-13T01-18-40.981025.parquet"]}, {"split": "2023_10_16T05_09_13.488891", "path": ["**/details_harness|gsm8k|5_2023-10-16T05-09-13.488891.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T05-09-13.488891.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hellaswag|10_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hellaswag|10_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:10:39.450497.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T13:10:39.450497.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-12-49.842984.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-12-49.842984.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-12-49.842984.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-04T01-12-49.842984.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:10:39.450497.parquet"]}, 
{"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:10:39.450497.parquet"]}, 
{"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-management|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:10:39.450497.parquet"]}, 
{"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-04T01-12-49.842984.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-04T01-12-49.842984.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T01_18_40.981025", "path": ["**/details_harness|winogrande|5_2023-10-13T01-18-40.981025.parquet"]}, {"split": "2023_10_16T05_09_13.488891", "path": ["**/details_harness|winogrande|5_2023-10-16T05-09-13.488891.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T05-09-13.488891.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T13_10_39.450497", "path": ["results_2023-07-20T13:10:39.450497.parquet"]}, {"split": "2023_10_04T01_12_49.842984", "path": ["results_2023-10-04T01-12-49.842984.parquet"]}, {"split": "2023_10_13T01_18_40.981025", "path": ["results_2023-10-13T01-18-40.981025.parquet"]}, {"split": "2023_10_16T05_09_13.488891", "path": ["results_2023-10-16T05-09-13.488891.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T05-09-13.488891.parquet"]}]}]}
2023-10-16T04:09:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mosaicml/mpt-30b-chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model mosaicml/mpt-30b-chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T05:09:13.488891 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
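The flattened card above ends at "To load the details from a run, you can for instance do the following:" without the accompanying snippet. A minimal sketch of that loading step is given below; it assumes the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention (so `open-llm-leaderboard/details_mosaicml__mpt-30b-chat`), which this record does not state explicitly, and uses the `harness_winogrande_5` configuration listed in the metadata block.

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the "details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-30b-chat",
    "harness_winogrande_5",  # one of the 64 task configurations listed in the metadata
    split="train",           # per the card, "train" always points to the latest results
)
```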
[ "# Dataset Card for Evaluation run of mosaicml/mpt-30b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T05:09:13.488891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mosaicml/mpt-30b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T05:09:13.488891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mosaicml/mpt-30b-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T05:09:13.488891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e03697ee5489d0bb110688a6d1239ff779651050
# Dataset Card for Evaluation run of mosaicml/mpt-7b-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-instruct",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T07:03:23.990596](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-instruct/blob/main/results_2023-09-23T07-03-23.990596.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.2429739932885906,
        "em_stderr": 0.004392127579519805,
        "f1": 0.2939712667785233,
        "f1_stderr": 0.004382684089142145,
        "acc": 0.3664330383509068,
        "acc_stderr": 0.00868382013779556
    },
    "harness|drop|3": {
        "em": 0.2429739932885906,
        "em_stderr": 0.004392127579519805,
        "f1": 0.2939712667785233,
        "f1_stderr": 0.004382684089142145
    },
    "harness|gsm8k|5": {
        "acc": 0.028051554207733132,
        "acc_stderr": 0.0045482295338363475
    },
    "harness|winogrande|5": {
        "acc": 0.7048145224940805,
        "acc_stderr": 0.012819410741754772
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_mosaicml__mpt-7b-instruct
[ "region:us" ]
2023-08-17T22:59:51+00:00
{"pretty_name": "Evaluation run of mosaicml/mpt-7b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T07:03:23.990596](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-instruct/blob/main/results_2023-09-23T07-03-23.990596.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2429739932885906,\n \"em_stderr\": 0.004392127579519805,\n \"f1\": 0.2939712667785233,\n \"f1_stderr\": 0.004382684089142145,\n \"acc\": 0.3664330383509068,\n \"acc_stderr\": 0.00868382013779556\n },\n \"harness|drop|3\": {\n \"em\": 0.2429739932885906,\n \"em_stderr\": 0.004392127579519805,\n \"f1\": 0.2939712667785233,\n \"f1_stderr\": 0.004382684089142145\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \"acc_stderr\": 0.0045482295338363475\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754772\n }\n}\n```", "repo_url": "https://huggingface.co/mosaicml/mpt-7b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T07_03_23.990596", "path": ["**/details_harness|drop|3_2023-09-23T07-03-23.990596.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T07-03-23.990596.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T07_03_23.990596", "path": ["**/details_harness|gsm8k|5_2023-09-23T07-03-23.990596.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T07-03-23.990596.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:01:10.556120.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:01:10.556120.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T07_03_23.990596", "path": ["**/details_harness|winogrande|5_2023-09-23T07-03-23.990596.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T07-03-23.990596.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T10_01_10.556120", "path": ["results_2023-07-20T10:01:10.556120.parquet"]}, {"split": "2023_09_23T07_03_23.990596", "path": ["results_2023-09-23T07-03-23.990596.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T07-03-23.990596.parquet"]}]}]}
2023-09-23T06:03:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mosaicml/mpt-7b-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model mosaicml/mpt-7b-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T07:03:23.990596 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
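The card above references a loading snippet without showing it. Below is a minimal sketch of what it would look like; the repository id is inferred from the card title and the leaderboard's `details_<org>__<model>` naming pattern (an assumption, not quoted from this record), while the `harness_winogrande_5` config and its `latest` split appear in this record's metadata.

```python
from datasets import load_dataset

# Repository id inferred from the card title; follows the open-llm-leaderboard
# "details_<org>__<model>" naming pattern (an assumption, not quoted from this record).
repo = "open-llm-leaderboard/details_mosaicml__mpt-7b-instruct"

# Per-example details for one task; the "latest" split points at the most recent run.
data = load_dataset(repo, "harness_winogrande_5", split="latest")
print(data[0])
```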
[ "# Dataset Card for Evaluation run of mosaicml/mpt-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T07:03:23.990596(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mosaicml/mpt-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T07:03:23.990596(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mosaicml/mpt-7b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T07:03:23.990596(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d5d37eceaa5e6143f8146b75e2d61c67ca29bf48
# Dataset Card for Evaluation run of mosaicml/mpt-7b-chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-chat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T09:38:22.163645](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-chat/blob/main/results_2023-10-17T09-38-22.163645.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.06952600671140939,
        "em_stderr": 0.002604746204517829,
        "f1": 0.12196937919463072,
        "f1_stderr": 0.002840521979064293,
        "acc": 0.3626168565432783,
        "acc_stderr": 0.009260585769647573
    },
    "harness|drop|3": {
        "em": 0.06952600671140939,
        "em_stderr": 0.002604746204517829,
        "f1": 0.12196937919463072,
        "f1_stderr": 0.002840521979064293
    },
    "harness|gsm8k|5": {
        "acc": 0.04094010614101592,
        "acc_stderr": 0.005458076796294338
    },
    "harness|winogrande|5": {
        "acc": 0.6842936069455406,
        "acc_stderr": 0.01306309474300081
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
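The snippet in the card above loads per-task details; the aggregated numbers quoted under "Latest results" can also be pulled from the `results` configuration, whose name and `latest` split come from this record's metadata. The following is a minimal sketch using the standard `datasets` API; the exact row layout returned is an assumption.

```python
from datasets import load_dataset

# The "results" config aggregates the metrics of every run for this model;
# its "latest" split points at the most recent run (2023-10-17 here).
results = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-7b-chat",
    "results",
    split="latest",
)
print(results[0])  # row layout is an assumption: aggregated metrics for the run
```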
open-llm-leaderboard/details_mosaicml__mpt-7b-chat
[ "region:us" ]
2023-08-17T22:59:59+00:00
{"pretty_name": "Evaluation run of mosaicml/mpt-7b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T09:38:22.163645](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-chat/blob/main/results_2023-10-17T09-38-22.163645.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06952600671140939,\n \"em_stderr\": 0.002604746204517829,\n \"f1\": 0.12196937919463072,\n \"f1_stderr\": 0.002840521979064293,\n \"acc\": 0.3626168565432783,\n \"acc_stderr\": 0.009260585769647573\n },\n \"harness|drop|3\": {\n \"em\": 0.06952600671140939,\n \"em_stderr\": 0.002604746204517829,\n \"f1\": 0.12196937919463072,\n \"f1_stderr\": 0.002840521979064293\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04094010614101592,\n \"acc_stderr\": 0.005458076796294338\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6842936069455406,\n \"acc_stderr\": 0.01306309474300081\n }\n}\n```", "repo_url": "https://huggingface.co/mosaicml/mpt-7b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T09_38_22.163645", "path": ["**/details_harness|drop|3_2023-10-17T09-38-22.163645.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T09-38-22.163645.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T09_38_22.163645", "path": ["**/details_harness|gsm8k|5_2023-10-17T09-38-22.163645.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T09-38-22.163645.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:00:41.356813.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:00:41.356813.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T09_38_22.163645", "path": ["**/details_harness|winogrande|5_2023-10-17T09-38-22.163645.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T09-38-22.163645.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T10_00_41.356813", "path": ["results_2023-07-20T10:00:41.356813.parquet"]}, {"split": "2023_10_17T09_38_22.163645", "path": ["results_2023-10-17T09-38-22.163645.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T09-38-22.163645.parquet"]}]}]}
2023-10-17T08:38:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mosaicml/mpt-7b-chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model mosaicml/mpt-7b-chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T09:38:22.163645 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of mosaicml/mpt-7b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T09:38:22.163645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mosaicml/mpt-7b-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T09:38:22.163645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mosaicml/mpt-7b-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T09:38:22.163645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
564c4684dd93ecfcd4cc484e74f516d50bf03bb2
# Dataset Card for Evaluation run of mosaicml/mpt-7b-storywriter

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-storywriter
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-16T08:53:05.263222](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter/blob/main/results_2023-10-16T08-53-05.263222.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0006291946308724832,
        "em_stderr": 0.00025680027497237983,
        "f1": 0.0032026006711409396,
        "f1_stderr": 0.0005040610386397096,
        "acc": 0.2557221783741121,
        "acc_stderr": 0.0070244020999296625
    },
    "harness|drop|3": {
        "em": 0.0006291946308724832,
        "em_stderr": 0.00025680027497237983,
        "f1": 0.0032026006711409396,
        "f1_stderr": 0.0005040610386397096
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5114443567482242,
        "acc_stderr": 0.014048804199859325
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
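For exploring this repository beyond the single config shown in the card above, here is a brief hedged sketch: `get_dataset_config_names` is the standard `datasets` helper for listing configurations, and the `harness_winogrande_5` config name is taken from the card's own snippet; the `latest` split is assumed to follow the split naming visible for the other configs in this record's metadata.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter"

# Each evaluated task has its own configuration; "results" holds the aggregated metrics.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Per-example details for one task; "latest" is assumed to follow the split
# naming used by the other configs in this record's metadata.
winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")
print(winogrande.column_names)
```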
open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter
[ "region:us" ]
2023-08-17T23:00:09+00:00
{"pretty_name": "Evaluation run of mosaicml/mpt-7b-storywriter", "dataset_summary": "Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T08:53:05.263222](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter/blob/main/results_2023-10-16T08-53-05.263222.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.00025680027497237983,\n \"f1\": 0.0032026006711409396,\n \"f1_stderr\": 0.0005040610386397096,\n \"acc\": 0.2557221783741121,\n \"acc_stderr\": 0.0070244020999296625\n },\n \"harness|drop|3\": {\n \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.00025680027497237983,\n \"f1\": 0.0032026006711409396,\n \"f1_stderr\": 0.0005040610386397096\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5114443567482242,\n \"acc_stderr\": 0.014048804199859325\n }\n}\n```", "repo_url": "https://huggingface.co/mosaicml/mpt-7b-storywriter", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|arc:challenge|25_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T15_18_47.960530", "path": ["**/details_harness|drop|3_2023-09-22T15-18-47.960530.parquet"]}, {"split": "2023_10_16T08_53_05.263222", "path": ["**/details_harness|drop|3_2023-10-16T08-53-05.263222.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T08-53-05.263222.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T15_18_47.960530", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-18-47.960530.parquet"]}, {"split": "2023_10_16T08_53_05.263222", "path": ["**/details_harness|gsm8k|5_2023-10-16T08-53-05.263222.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-16T08-53-05.263222.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hellaswag|10_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:23:53.118062.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:23:53.118062.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": 
"2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": 
["**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": 
"2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-03T22-53-23.133729.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-03T22-53-23.133729.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T15_18_47.960530", "path": ["**/details_harness|winogrande|5_2023-09-22T15-18-47.960530.parquet"]}, {"split": "2023_10_16T08_53_05.263222", "path": ["**/details_harness|winogrande|5_2023-10-16T08-53-05.263222.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T08-53-05.263222.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T10_23_53.118062", "path": ["results_2023-07-20T10:23:53.118062.parquet"]}, {"split": "2023_09_22T15_18_47.960530", "path": ["results_2023-09-22T15-18-47.960530.parquet"]}, {"split": "2023_10_03T22_53_23.133729", "path": ["results_2023-10-03T22-53-23.133729.parquet"]}, {"split": "2023_10_16T08_53_05.263222", "path": ["results_2023-10-16T08-53-05.263222.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T08-53-05.263222.parquet"]}]}]}
2023-10-16T07:53:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mosaicml/mpt-7b-storywriter ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model mosaicml/mpt-7b-storywriter on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-16T08:53:05.263222 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of mosaicml/mpt-7b-storywriter", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-storywriter on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T08:53:05.263222(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mosaicml/mpt-7b-storywriter", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-storywriter on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-16T08:53:05.263222(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mosaicml/mpt-7b-storywriter## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-7b-storywriter on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T08:53:05.263222(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d4062a9887375ca728f29a5c744f928424ac8f5f
# Dataset Card for Evaluation run of mosaicml/mpt-30b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/mosaicml/mpt-30b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [mosaicml/mpt-30b](https://huggingface.co/mosaicml/mpt-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 121 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-30b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T21:16:10.122572](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-30b/blob/main/results_2023-12-04T21-16-10.122572.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.48238773450038697, "acc_stderr": 0.03469207472678917, "acc_norm": 0.48716827364491283, "acc_norm_stderr": 0.03546745870449251, "mc1": 0.2582619339045288, "mc1_stderr": 0.015321821688476196, "mc2": 0.3841558252351552, "mc2_stderr": 0.013607507438444062 }, "harness|arc:challenge|25": { "acc": 0.5290102389078498, "acc_stderr": 0.014586776355294317, "acc_norm": 0.5597269624573379, "acc_norm_stderr": 0.014506769524804237 }, "harness|hellaswag|10": { "acc": 0.6195976897032464, "acc_stderr": 0.004844935327599206, "acc_norm": 0.8242381995618403, "acc_norm_stderr": 0.0037983950550215346 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.40789473684210525, "acc_stderr": 0.03999309712777471, "acc_norm": 0.40789473684210525, "acc_norm_stderr": 0.03999309712777471 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4867924528301887, "acc_stderr": 0.030762134874500476, "acc_norm": 0.4867924528301887, "acc_norm_stderr": 0.030762134874500476 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5138888888888888, "acc_stderr": 0.04179596617581, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.04179596617581 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4508670520231214, "acc_stderr": 0.037940126746970275, "acc_norm": 0.4508670520231214, "acc_norm_stderr": 0.037940126746970275 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077636, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.41702127659574467, "acc_stderr": 0.03223276266711712, "acc_norm": 0.41702127659574467, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2982456140350877, "acc_stderr": 0.04303684033537315, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.04303684033537315 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3306878306878307, "acc_stderr": 0.02422996529842509, "acc_norm": 0.3306878306878307, "acc_norm_stderr": 0.02422996529842509 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.0404061017820884, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.0404061017820884 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5419354838709678, "acc_stderr": 0.028343787250540632, "acc_norm": 0.5419354838709678, "acc_norm_stderr": 0.028343787250540632 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.35467980295566504, "acc_stderr": 0.0336612448905145, "acc_norm": 0.35467980295566504, "acc_norm_stderr": 0.0336612448905145 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6, "acc_stderr": 0.03825460278380026, "acc_norm": 0.6, "acc_norm_stderr": 0.03825460278380026 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5959595959595959, "acc_stderr": 0.03496130972056128, "acc_norm": 0.5959595959595959, "acc_norm_stderr": 0.03496130972056128 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6476683937823834, "acc_stderr": 0.03447478286414357, "acc_norm": 0.6476683937823834, "acc_norm_stderr": 0.03447478286414357 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.46923076923076923, "acc_stderr": 0.02530295889085015, "acc_norm": 0.46923076923076923, "acc_norm_stderr": 0.02530295889085015 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2814814814814815, "acc_stderr": 0.027420019350945284, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.027420019350945284 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.0322529423239964, "acc_norm": 0.4411764705882353, 
"acc_norm_stderr": 0.0322529423239964 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6770642201834862, "acc_stderr": 0.020048115923415315, "acc_norm": 0.6770642201834862, "acc_norm_stderr": 0.020048115923415315 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.36574074074074076, "acc_stderr": 0.03284738857647208, "acc_norm": 0.36574074074074076, "acc_norm_stderr": 0.03284738857647208 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03308611113236436, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03308611113236436 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6708860759493671, "acc_stderr": 0.03058732629470236, "acc_norm": 0.6708860759493671, "acc_norm_stderr": 0.03058732629470236 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5381165919282511, "acc_stderr": 0.033460150119732274, "acc_norm": 0.5381165919282511, "acc_norm_stderr": 0.033460150119732274 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5343511450381679, "acc_stderr": 0.043749285605997376, "acc_norm": 0.5343511450381679, "acc_norm_stderr": 0.043749285605997376 }, "harness|hendrycksTest-international_law|5": { "acc": 0.4297520661157025, "acc_stderr": 0.04519082021319773, "acc_norm": 0.4297520661157025, "acc_norm_stderr": 0.04519082021319773 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4722222222222222, "acc_stderr": 0.048262172941398944, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.048262172941398944 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4785276073619632, "acc_stderr": 0.0392474687675113, "acc_norm": 0.4785276073619632, "acc_norm_stderr": 0.0392474687675113 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.5631067961165048, "acc_stderr": 0.04911147107365777, "acc_norm": 0.5631067961165048, "acc_norm_stderr": 0.04911147107365777 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7136752136752137, "acc_stderr": 0.029614323690456655, "acc_norm": 0.7136752136752137, "acc_norm_stderr": 0.029614323690456655 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6871008939974457, "acc_stderr": 0.016580935940304038, "acc_norm": 0.6871008939974457, "acc_norm_stderr": 0.016580935940304038 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5115606936416185, "acc_stderr": 0.026911898686377913, "acc_norm": 0.5115606936416185, "acc_norm_stderr": 0.026911898686377913 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.26927374301675977, "acc_stderr": 0.014835616582882606, "acc_norm": 0.26927374301675977, "acc_norm_stderr": 0.014835616582882606 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5098039215686274, "acc_stderr": 0.02862441255016795, "acc_norm": 0.5098039215686274, "acc_norm_stderr": 0.02862441255016795 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5466237942122186, "acc_stderr": 0.028274359854894245, "acc_norm": 0.5466237942122186, "acc_norm_stderr": 0.028274359854894245 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5679012345679012, 
"acc_stderr": 0.02756301097160668, "acc_norm": 0.5679012345679012, "acc_norm_stderr": 0.02756301097160668 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3617021276595745, "acc_stderr": 0.028663820147199492, "acc_norm": 0.3617021276595745, "acc_norm_stderr": 0.028663820147199492 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.37809647979139505, "acc_stderr": 0.012384878406798095, "acc_norm": 0.37809647979139505, "acc_norm_stderr": 0.012384878406798095 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.38235294117647056, "acc_stderr": 0.029520095697687765, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.029520095697687765 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4526143790849673, "acc_stderr": 0.020136790918492534, "acc_norm": 0.4526143790849673, "acc_norm_stderr": 0.020136790918492534 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5727272727272728, "acc_stderr": 0.04738198703545483, "acc_norm": 0.5727272727272728, "acc_norm_stderr": 0.04738198703545483 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5510204081632653, "acc_stderr": 0.03184213866687579, "acc_norm": 0.5510204081632653, "acc_norm_stderr": 0.03184213866687579 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5472636815920398, "acc_stderr": 0.035197027175769155, "acc_norm": 0.5472636815920398, "acc_norm_stderr": 0.035197027175769155 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-virology|5": { "acc": 0.4457831325301205, "acc_stderr": 0.03869543323472101, "acc_norm": 0.4457831325301205, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.2582619339045288, "mc1_stderr": 0.015321821688476196, "mc2": 0.3841558252351552, "mc2_stderr": 0.013607507438444062 }, "harness|winogrande|5": { "acc": 0.7490134175217048, "acc_stderr": 0.01218577622051616 }, "harness|gsm8k|5": { "acc": 0.16906747536012132, "acc_stderr": 0.01032417144549735 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_mosaicml__mpt-30b
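Beyond the default `"train"` split shown in the loading example in the card above, every per-task configuration of this dataset also exposes one split per timestamped run plus a `"latest"` alias, as listed in the `configs` metadata below. The following is a minimal sketch, not part of the auto-generated card; it only reuses the repository, configuration, and split names that appear verbatim in that metadata:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_mosaicml__mpt-30b"

# Most recent per-sample details for one MMLU sub-task ("latest" aliases the newest run).
abstract_algebra_latest = load_dataset(
    REPO,
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# The same sub-task pinned to a specific run, addressed by its timestamped split name.
abstract_algebra_run = load_dataset(
    REPO,
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_12_04T21_16_10.122572",
)
```

Loading by the timestamped split name is useful when comparing runs recorded in this repository, since `"latest"` (like `"train"` in the card's own example) always tracks the most recent evaluation.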
[ "region:us" ]
2023-08-17T23:00:27+00:00
{"pretty_name": "Evaluation run of mosaicml/mpt-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [mosaicml/mpt-30b](https://huggingface.co/mosaicml/mpt-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 121 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T21:16:10.122572](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-30b/blob/main/results_2023-12-04T21-16-10.122572.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48238773450038697,\n \"acc_stderr\": 0.03469207472678917,\n \"acc_norm\": 0.48716827364491283,\n \"acc_norm_stderr\": 0.03546745870449251,\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.3841558252351552,\n \"mc2_stderr\": 0.013607507438444062\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294317,\n \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.014506769524804237\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6195976897032464,\n \"acc_stderr\": 0.004844935327599206,\n \"acc_norm\": 0.8242381995618403,\n \"acc_norm_stderr\": 0.0037983950550215346\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500476,\n \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500476\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.037940126746970275,\n \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.037940126746970275\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842509,\n \"acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842509\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5419354838709678,\n \"acc_stderr\": 0.028343787250540632,\n \"acc_norm\": 0.5419354838709678,\n \"acc_norm_stderr\": 0.028343787250540632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056128,\n \"acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056128\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.46923076923076923,\n \"acc_stderr\": 0.02530295889085015,\n \"acc_norm\": 0.46923076923076923,\n 
\"acc_norm_stderr\": 0.02530295889085015\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n \"acc_stderr\": 0.020048115923415315,\n \"acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.020048115923415315\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647208,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647208\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6708860759493671,\n \"acc_stderr\": 0.03058732629470236,\n \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.03058732629470236\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4297520661157025,\n \"acc_stderr\": 0.04519082021319773,\n \"acc_norm\": 0.4297520661157025,\n \"acc_norm_stderr\": 0.04519082021319773\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n \"acc_stderr\": 0.029614323690456655,\n \"acc_norm\": 0.7136752136752137,\n \"acc_norm_stderr\": 0.029614323690456655\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6871008939974457,\n \"acc_stderr\": 0.016580935940304038,\n \"acc_norm\": 0.6871008939974457,\n \"acc_norm_stderr\": 0.016580935940304038\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377913,\n \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377913\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26927374301675977,\n \"acc_stderr\": 0.014835616582882606,\n \"acc_norm\": 0.26927374301675977,\n \"acc_norm_stderr\": 0.014835616582882606\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.02862441255016795,\n \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.02862441255016795\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n \"acc_stderr\": 0.028274359854894245,\n \"acc_norm\": 0.5466237942122186,\n \"acc_norm_stderr\": 0.028274359854894245\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.02756301097160668,\n \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.02756301097160668\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37809647979139505,\n \"acc_stderr\": 0.012384878406798095,\n \"acc_norm\": 0.37809647979139505,\n \"acc_norm_stderr\": 0.012384878406798095\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687765,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687765\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4526143790849673,\n \"acc_stderr\": 0.020136790918492534,\n \"acc_norm\": 0.4526143790849673,\n \"acc_norm_stderr\": 0.020136790918492534\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n \"acc_stderr\": 0.035197027175769155,\n \"acc_norm\": 0.5472636815920398,\n \"acc_norm_stderr\": 0.035197027175769155\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.3841558252351552,\n \"mc2_stderr\": 0.013607507438444062\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.01218577622051616\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16906747536012132,\n \"acc_stderr\": 0.01032417144549735\n }\n}\n```", "repo_url": "https://huggingface.co/mosaicml/mpt-30b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|arc:challenge|25_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|arc:challenge|25_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|gsm8k|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hellaswag|10_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hellaswag|10_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:09:09.001286.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T13:09:09.001286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T21-16-10.122572.parquet", 
"**/details_harness|hendrycksTest-college_biology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T21-16-10.122572.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T21-16-10.122572.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T21-16-10.122572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": 
"2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T21_16_10.122572", "path": ["**/details_harness|winogrande|5_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T21-16-10.122572.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:30:08.303629.parquet", 
"**/details_original|mmlu:astronomy|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:30:08.303629.parquet", 
"**/details_original|mmlu:nutrition|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:30:08.303629.parquet", 
"**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:30:08.303629.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", 
"data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": 
"2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": 
"latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": 
"2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_30_08.303629", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:30:08.303629.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:30:08.303629.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T13_09_09.001286", "path": ["results_2023-07-20T13:09:09.001286.parquet"]}, {"split": "2023_08_28T20_30_08.303629", "path": ["results_2023-08-28T20:30:08.303629.parquet"]}, {"split": "2023_12_04T21_16_10.122572", "path": ["results_2023-12-04T21-16-10.122572.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T21-16-10.122572.parquet"]}]}]}
2023-12-04T21:19:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mosaicml/mpt-30b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model mosaicml/mpt-30b on the Open LLM Leaderboard. The dataset is composed of 121 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T21:16:10.122572 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of mosaicml/mpt-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 121 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T21:16:10.122572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mosaicml/mpt-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 121 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T21:16:10.122572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 16, 31, 165, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mosaicml/mpt-30b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 121 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T21:16:10.122572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
46f691d398b2def4b8e7eb1b5acd07195ecb4481
# Dataset Card for Evaluation run of mosaicml/mpt-30b-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/mosaicml/mpt-30b-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [mosaicml/mpt-30b-instruct](https://huggingface.co/mosaicml/mpt-30b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-30b-instruct",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T20:57:09.846204](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-30b-instruct/blob/main/results_2023-10-14T20-57-09.846204.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.3308515100671141,
        "em_stderr": 0.004818562129043009,
        "f1": 0.38283766778523554,
        "f1_stderr": 0.00472140525052066,
        "acc": 0.4522637692207808,
        "acc_stderr": 0.011033521433097288
    },
    "harness|drop|3": {
        "em": 0.3308515100671141,
        "em_stderr": 0.004818562129043009,
        "f1": 0.38283766778523554,
        "f1_stderr": 0.00472140525052066
    },
    "harness|gsm8k|5": {
        "acc": 0.15314632297194844,
        "acc_stderr": 0.009919728152791466
    },
    "harness|winogrande|5": {
        "acc": 0.7513812154696132,
        "acc_stderr": 0.012147314713403112
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
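The card above shows how to load the per-example details for one task. As a complementary illustration, the snippet below is a minimal sketch of pulling the aggregated metrics through the "results" configuration mentioned in the summary; it assumes that configuration exposes a "latest" split, as the per-task configurations listed in this dataset's metadata do.

```python
from datasets import load_dataset

# Minimal sketch (assumption: the "results" configuration has a "latest"
# split, mirroring the per-task configurations): load the aggregated
# metrics of the most recent evaluation run for mosaicml/mpt-30b-instruct.
results = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-30b-instruct",
    "results",
    split="latest",
)

# Inspect what was loaded; the exact schema of the results rows may vary.
print(results[0])
```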
open-llm-leaderboard/details_mosaicml__mpt-30b-instruct
[ "region:us" ]
2023-08-17T23:00:36+00:00
{"pretty_name": "Evaluation run of mosaicml/mpt-30b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [mosaicml/mpt-30b-instruct](https://huggingface.co/mosaicml/mpt-30b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-30b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T20:57:09.846204](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-30b-instruct/blob/main/results_2023-10-14T20-57-09.846204.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3308515100671141,\n \"em_stderr\": 0.004818562129043009,\n \"f1\": 0.38283766778523554,\n \"f1_stderr\": 0.00472140525052066,\n \"acc\": 0.4522637692207808,\n \"acc_stderr\": 0.011033521433097288\n },\n \"harness|drop|3\": {\n \"em\": 0.3308515100671141,\n \"em_stderr\": 0.004818562129043009,\n \"f1\": 0.38283766778523554,\n \"f1_stderr\": 0.00472140525052066\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15314632297194844,\n \"acc_stderr\": 0.009919728152791466\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403112\n }\n}\n```", "repo_url": "https://huggingface.co/mosaicml/mpt-30b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|arc:challenge|25_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T20_57_09.846204", "path": ["**/details_harness|drop|3_2023-10-14T20-57-09.846204.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T20-57-09.846204.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T20_57_09.846204", "path": ["**/details_harness|gsm8k|5_2023-10-14T20-57-09.846204.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T20-57-09.846204.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hellaswag|10_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:11:24.937399.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:11:24.937399.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T13:11:24.937399.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T13:11:24.937399.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T13:11:24.937399.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T20_57_09.846204", "path": ["**/details_harness|winogrande|5_2023-10-14T20-57-09.846204.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T20-57-09.846204.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T13_11_24.937399", "path": ["results_2023-07-20T13:11:24.937399.parquet"]}, {"split": "2023_10_14T20_57_09.846204", "path": ["results_2023-10-14T20-57-09.846204.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T20-57-09.846204.parquet"]}]}]}
2023-10-14T19:57:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mosaicml/mpt-30b-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model mosaicml/mpt-30b-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-14T20:57:09.846204 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
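The condensed card above says "you can for instance do the following" but this rendering omits the snippet. A minimal sketch, assuming the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming used by the other entries in this dump; the `harness_winogrande_5` configuration and its `latest` split are the ones listed in the metadata above.

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the details_<org>__<model>
# naming pattern used elsewhere in this dump (not stated for this entry).
data = load_dataset(
    "open-llm-leaderboard/details_mosaicml__mpt-30b-instruct",
    "harness_winogrande_5",  # per-task configuration listed in the metadata above
    split="latest",          # "latest" points to the most recent evaluation run
)
```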
[ "# Dataset Card for Evaluation run of mosaicml/mpt-30b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T20:57:09.846204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mosaicml/mpt-30b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T20:57:09.846204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mosaicml/mpt-30b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mosaicml/mpt-30b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T20:57:09.846204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
28fb6cb87a2c5f59a035340b6e8ca3b1977b0ec5
# Dataset Card for Evaluation run of clibrain/Llama-2-7b-ft-instruct-es

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/clibrain/Llama-2-7b-ft-instruct-es
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [clibrain/Llama-2-7b-ft-instruct-es](https://huggingface.co/clibrain/Llama-2-7b-ft-instruct-es) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_clibrain__Llama-2-7b-ft-instruct-es",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T14:05:09.748904](https://huggingface.co/datasets/open-llm-leaderboard/details_clibrain__Llama-2-7b-ft-instruct-es/blob/main/results_2023-09-17T14-05-09.748904.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001363255033557047,
        "em_stderr": 0.00037786091964606556,
        "f1": 0.059617239932886215,
        "f1_stderr": 0.0013507073733013888,
        "acc": 0.4045158699907191,
        "acc_stderr": 0.009256588130982506
    },
    "harness|drop|3": {
        "em": 0.001363255033557047,
        "em_stderr": 0.00037786091964606556,
        "f1": 0.059617239932886215,
        "f1_stderr": 0.0013507073733013888
    },
    "harness|gsm8k|5": {
        "acc": 0.05686125852918878,
        "acc_stderr": 0.006378790242099664
    },
    "harness|winogrande|5": {
        "acc": 0.7521704814522494,
        "acc_stderr": 0.01213438601986535
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
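As a supplement to the per-task example in the card above, a minimal sketch of pulling the aggregated scores: the card describes a "results" configuration, and the repository metadata below lists a "latest" split for it that points at the most recent run.

```python
from datasets import load_dataset

# Aggregated metrics for the latest evaluation run; the "results" configuration
# and the "latest" split are the ones described in the card and metadata.
results = load_dataset(
    "open-llm-leaderboard/details_clibrain__Llama-2-7b-ft-instruct-es",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated scores for the most recent run
```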
open-llm-leaderboard/details_clibrain__Llama-2-7b-ft-instruct-es
[ "region:us" ]
2023-08-17T23:00:44+00:00
{"pretty_name": "Evaluation run of clibrain/Llama-2-7b-ft-instruct-es", "dataset_summary": "Dataset automatically created during the evaluation run of model [clibrain/Llama-2-7b-ft-instruct-es](https://huggingface.co/clibrain/Llama-2-7b-ft-instruct-es) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_clibrain__Llama-2-7b-ft-instruct-es\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T14:05:09.748904](https://huggingface.co/datasets/open-llm-leaderboard/details_clibrain__Llama-2-7b-ft-instruct-es/blob/main/results_2023-09-17T14-05-09.748904.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606556,\n \"f1\": 0.059617239932886215,\n \"f1_stderr\": 0.0013507073733013888,\n \"acc\": 0.4045158699907191,\n \"acc_stderr\": 0.009256588130982506\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606556,\n \"f1\": 0.059617239932886215,\n \"f1_stderr\": 0.0013507073733013888\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05686125852918878,\n \"acc_stderr\": 0.006378790242099664\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n }\n}\n```", "repo_url": "https://huggingface.co/clibrain/Llama-2-7b-ft-instruct-es", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|arc:challenge|25_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T14_05_09.748904", "path": ["**/details_harness|drop|3_2023-09-17T14-05-09.748904.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T14-05-09.748904.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T14_05_09.748904", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-05-09.748904.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-05-09.748904.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hellaswag|10_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T22:51:22.839971.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T22:51:22.839971.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T22:51:22.839971.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T22:51:22.839971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T22:51:22.839971.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T22:51:22.839971.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T14_05_09.748904", "path": ["**/details_harness|winogrande|5_2023-09-17T14-05-09.748904.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T14-05-09.748904.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T22_51_22.839971", "path": ["results_2023-08-09T22:51:22.839971.parquet"]}, {"split": "2023_09_17T14_05_09.748904", "path": ["results_2023-09-17T14-05-09.748904.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T14-05-09.748904.parquet"]}]}]}
2023-09-17T13:05:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of clibrain/Llama-2-7b-ft-instruct-es ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model clibrain/Llama-2-7b-ft-instruct-es on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T14:05:09.748904 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
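The `load_dataset` snippet referenced by "you can for instance do the following" was stripped from this processed text. Below is a minimal sketch of the call; the repository id is inferred from the model name and the details-repo naming pattern used by the other records in this dump, and the config name is taken from this record's metadata, so treat both as assumptions rather than values stated in this record.

```python
from datasets import load_dataset

# Repository id inferred from the model name and the "details_<org>__<model>"
# naming pattern seen elsewhere in this dump; "harness_winogrande_5" is one of
# the configs listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_clibrain__Llama-2-7b-ft-instruct-es",
    "harness_winogrande_5",
    split="train",
)
```

Per the config metadata above, each configuration also exposes a `latest` split alongside the timestamped one, so the most recent run can also be requested explicitly via `split="latest"`.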
[ "# Dataset Card for Evaluation run of clibrain/Llama-2-7b-ft-instruct-es", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model clibrain/Llama-2-7b-ft-instruct-es on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T14:05:09.748904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of clibrain/Llama-2-7b-ft-instruct-es", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model clibrain/Llama-2-7b-ft-instruct-es on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T14:05:09.748904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of clibrain/Llama-2-7b-ft-instruct-es## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model clibrain/Llama-2-7b-ft-instruct-es on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T14:05:09.748904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2cd0462ac189437e59aee2d1d29fcf26ca743501
# Dataset Card for Evaluation run of deepnight-research/llama-2-70B-inst ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/deepnight-research/llama-2-70B-inst - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [deepnight-research/llama-2-70B-inst](https://huggingface.co/deepnight-research/llama-2-70B-inst) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-10T00:26:50.478989](https://huggingface.co/datasets/open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst/blob/main/results_2023-08-10T00%3A26%3A50.478989.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7050740464217434, "acc_stderr": 0.03085018588043536, "acc_norm": 0.7087855823993987, "acc_norm_stderr": 0.03081992944181276, "mc1": 0.44430844553243576, "mc1_stderr": 0.017394586250743173, "mc2": 0.6224972679005382, "mc2_stderr": 0.014880875055625352 }, "harness|arc:challenge|25": { "acc": 0.6732081911262798, "acc_stderr": 0.013706665975587333, "acc_norm": 0.7107508532423208, "acc_norm_stderr": 0.013250012579393441 }, "harness|hellaswag|10": { "acc": 0.6974706233817964, "acc_stderr": 0.00458414401465495, "acc_norm": 0.8789085839474209, "acc_norm_stderr": 0.0032556675321152857 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8421052631578947, "acc_stderr": 0.029674167520101453, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.029674167520101453 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.044084400227680794, "acc_norm": 0.74, "acc_norm_stderr": 0.044084400227680794 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7320754716981132, "acc_stderr": 0.027257260322494845, "acc_norm": 0.7320754716981132, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8402777777777778, "acc_stderr": 0.030635578972093274, "acc_norm": 0.8402777777777778, "acc_norm_stderr": 0.030635578972093274 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 
0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.036146654241808254, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.036146654241808254 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7063829787234043, "acc_stderr": 0.029771642712491227, "acc_norm": 0.7063829787234043, "acc_norm_stderr": 0.029771642712491227 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.04692008381368909, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.04692008381368909 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6482758620689655, "acc_stderr": 0.0397923663749741, "acc_norm": 0.6482758620689655, "acc_norm_stderr": 0.0397923663749741 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.46825396825396826, "acc_stderr": 0.025699352832131792, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.025699352832131792 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677173, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8096774193548387, "acc_stderr": 0.02233170761182307, "acc_norm": 0.8096774193548387, "acc_norm_stderr": 0.02233170761182307 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5615763546798029, "acc_stderr": 0.03491207857486519, "acc_norm": 0.5615763546798029, "acc_norm_stderr": 0.03491207857486519 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8424242424242424, "acc_stderr": 0.02845038880528436, "acc_norm": 0.8424242424242424, "acc_norm_stderr": 0.02845038880528436 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8737373737373737, "acc_stderr": 0.023664359402880242, "acc_norm": 0.8737373737373737, "acc_norm_stderr": 0.023664359402880242 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9378238341968912, "acc_stderr": 0.017426974154240528, "acc_norm": 0.9378238341968912, "acc_norm_stderr": 0.017426974154240528 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7102564102564103, "acc_stderr": 0.023000628243687968, "acc_norm": 0.7102564102564103, "acc_norm_stderr": 0.023000628243687968 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.028406533090608463, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.028406533090608463 }, 
"harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7647058823529411, "acc_stderr": 0.02755361446786381, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.02755361446786381 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.47019867549668876, "acc_stderr": 0.04075224992216979, "acc_norm": 0.47019867549668876, "acc_norm_stderr": 0.04075224992216979 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9027522935779817, "acc_stderr": 0.012703533408540366, "acc_norm": 0.9027522935779817, "acc_norm_stderr": 0.012703533408540366 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6018518518518519, "acc_stderr": 0.033384734032074016, "acc_norm": 0.6018518518518519, "acc_norm_stderr": 0.033384734032074016 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9264705882352942, "acc_stderr": 0.01831885585008968, "acc_norm": 0.9264705882352942, "acc_norm_stderr": 0.01831885585008968 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8945147679324894, "acc_stderr": 0.01999556072375854, "acc_norm": 0.8945147679324894, "acc_norm_stderr": 0.01999556072375854 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7937219730941704, "acc_stderr": 0.02715715047956382, "acc_norm": 0.7937219730941704, "acc_norm_stderr": 0.02715715047956382 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8625954198473282, "acc_stderr": 0.030194823996804475, "acc_norm": 0.8625954198473282, "acc_norm_stderr": 0.030194823996804475 }, "harness|hendrycksTest-international_law|5": { "acc": 0.859504132231405, "acc_stderr": 0.03172233426002157, "acc_norm": 0.859504132231405, "acc_norm_stderr": 0.03172233426002157 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.803680981595092, "acc_stderr": 0.031207970394709218, "acc_norm": 0.803680981595092, "acc_norm_stderr": 0.031207970394709218 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489122, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489122 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.0376017800602662, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.0376017800602662 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9017094017094017, "acc_stderr": 0.019503444900757567, "acc_norm": 0.9017094017094017, "acc_norm_stderr": 0.019503444900757567 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8684546615581098, "acc_stderr": 0.01208670521425043, "acc_norm": 0.8684546615581098, "acc_norm_stderr": 0.01208670521425043 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7803468208092486, "acc_stderr": 0.022289638852617893, "acc_norm": 0.7803468208092486, "acc_norm_stderr": 0.022289638852617893 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6044692737430167, "acc_stderr": 0.01635341541007577, "acc_norm": 0.6044692737430167, "acc_norm_stderr": 0.01635341541007577 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7679738562091504, "acc_stderr": 0.024170840879340873, "acc_norm": 0.7679738562091504, "acc_norm_stderr": 0.024170840879340873 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7781350482315113, "acc_stderr": 0.02359885829286305, 
"acc_norm": 0.7781350482315113, "acc_norm_stderr": 0.02359885829286305 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8333333333333334, "acc_stderr": 0.020736358408060006, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.020736358408060006 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.574468085106383, "acc_stderr": 0.029494827600144366, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.029494827600144366 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5521512385919165, "acc_stderr": 0.012700582404768235, "acc_norm": 0.5521512385919165, "acc_norm_stderr": 0.012700582404768235 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7389705882352942, "acc_stderr": 0.02667925227010314, "acc_norm": 0.7389705882352942, "acc_norm_stderr": 0.02667925227010314 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7647058823529411, "acc_stderr": 0.01716058723504635, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.01716058723504635 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7454545454545455, "acc_stderr": 0.041723430387053825, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8204081632653061, "acc_stderr": 0.024573293589585637, "acc_norm": 0.8204081632653061, "acc_norm_stderr": 0.024573293589585637 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8756218905472637, "acc_stderr": 0.023335401790166327, "acc_norm": 0.8756218905472637, "acc_norm_stderr": 0.023335401790166327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8771929824561403, "acc_stderr": 0.02517298435015575, "acc_norm": 0.8771929824561403, "acc_norm_stderr": 0.02517298435015575 }, "harness|truthfulqa:mc|0": { "mc1": 0.44430844553243576, "mc1_stderr": 0.017394586250743173, "mc2": 0.6224972679005382, "mc2_stderr": 0.014880875055625352 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst
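The "Latest results" JSON in the card above lists one accuracy block per MMLU subject alongside an aggregate "all" block. As a minimal sketch of how such a payload can be summarised, the snippet below computes an unweighted macro-average over the `harness|hendrycksTest-*` entries; the two subject scores are copied verbatim from the card, and whether the leaderboard's own aggregate is computed exactly this way is not stated here.

```python
from statistics import mean

# Two per-subject blocks copied verbatim from the "Latest results" JSON above;
# a real run would iterate over all "harness|hendrycksTest-*" keys.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8771929824561403},
}

mmlu_acc = mean(
    block["acc"]
    for task, block in results.items()
    if task.startswith("harness|hendrycksTest-")
)
print(f"Unweighted MMLU macro-average: {mmlu_acc:.4f}")
```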
[ "region:us" ]
2023-08-17T23:00:54+00:00
{"pretty_name": "Evaluation run of deepnight-research/llama-2-70B-inst", "dataset_summary": "Dataset automatically created during the evaluation run of model [deepnight-research/llama-2-70B-inst](https://huggingface.co/deepnight-research/llama-2-70B-inst) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-10T00:26:50.478989](https://huggingface.co/datasets/open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst/blob/main/results_2023-08-10T00%3A26%3A50.478989.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7050740464217434,\n \"acc_stderr\": 0.03085018588043536,\n \"acc_norm\": 0.7087855823993987,\n \"acc_norm_stderr\": 0.03081992944181276,\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n \"mc2_stderr\": 0.014880875055625352\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587333,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6974706233817964,\n \"acc_stderr\": 0.00458414401465495,\n \"acc_norm\": 0.8789085839474209,\n \"acc_norm_stderr\": 0.0032556675321152857\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.029674167520101453,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.029674167520101453\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131792,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131792\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5615763546798029,\n \"acc_stderr\": 0.03491207857486519,\n \"acc_norm\": 0.5615763546798029,\n \"acc_norm_stderr\": 0.03491207857486519\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880242,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880242\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.04075224992216979,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.04075224992216979\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8684546615581098,\n \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6044692737430167,\n \"acc_stderr\": 0.01635341541007577,\n \"acc_norm\": 0.6044692737430167,\n \"acc_norm_stderr\": 0.01635341541007577\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060006,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060006\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5521512385919165,\n \"acc_stderr\": 0.012700582404768235,\n \"acc_norm\": 0.5521512385919165,\n \"acc_norm_stderr\": 0.012700582404768235\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n \"mc2_stderr\": 0.014880875055625352\n }\n}\n```", "repo_url": "https://huggingface.co/deepnight-research/llama-2-70B-inst", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:26:50.478989.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T00_26_50.478989", "path": ["results_2023-08-10T00:26:50.478989.parquet"]}, {"split": "latest", "path": ["results_2023-08-10T00:26:50.478989.parquet"]}]}]}
2023-08-27T11:26:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of deepnight-research/llama-2-70B-inst ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model deepnight-research/llama-2-70B-inst on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2023-08-10T00:26:50.478989 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
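The "To load the details from a run" sentence above originally introduced a short snippet that is missing from this flattened copy. Below is a minimal sketch of that load, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming (i.e. `open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst`, which is not spelled out in this dump); the configuration names such as `harness_truthfulqa_mc_0` and `results` are taken from the metadata block above.

```python
from datasets import load_dataset

# Repository id is assumed from the leaderboard's details_<org>__<model> convention;
# the config name "harness_truthfulqa_mc_0" appears in the "configs" metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_deepnight-research__llama-2-70B-inst",
    "harness_truthfulqa_mc_0",
    split="train",  # per the card, "train" always points to the latest results; "latest" is also defined
)
print(data)
```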
[ "# Dataset Card for Evaluation run of deepnight-research/llama-2-70B-inst", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepnight-research/llama-2-70B-inst on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-10T00:26:50.478989 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of deepnight-research/llama-2-70B-inst", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepnight-research/llama-2-70B-inst on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-10T00:26:50.478989 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of deepnight-research/llama-2-70B-inst## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepnight-research/llama-2-70B-inst on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-10T00:26:50.478989 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
60fe5810827b158f4045703f4e6735e8a84a97c4
# Dataset Card for Evaluation run of quantumaikr/open_llama_7b_hf ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/quantumaikr/open_llama_7b_hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [quantumaikr/open_llama_7b_hf](https://huggingface.co/quantumaikr/open_llama_7b_hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-07-19T17:01:48.631436](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf/blob/main/results_2023-07-19T17%3A01%3A48.631436.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2648279332452004, "acc_stderr": 0.03195749858994142, "acc_norm": 0.26548960439125074, "acc_norm_stderr": 0.03196726632461042, "mc1": 0.23133414932680538, "mc1_stderr": 0.014761945174862661, "mc2": 0.4954484536663258, "mc2_stderr": 0.016312743256662564 }, "harness|arc:challenge|25": { "acc": 0.23293515358361774, "acc_stderr": 0.012352507042617391, "acc_norm": 0.2645051194539249, "acc_norm_stderr": 0.012889272949313366 }, "harness|hellaswag|10": { "acc": 0.26199960167297354, "acc_stderr": 0.004388237557526716, "acc_norm": 0.26946823341963755, "acc_norm_stderr": 0.004427767996301633 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.31851851851851853, "acc_stderr": 0.040247784019771096, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.040247784019771096 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2894736842105263, "acc_stderr": 0.036906779861372814, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.036906779861372814 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2490566037735849, "acc_stderr": 0.026616482980501704, "acc_norm": 0.2490566037735849, "acc_norm_stderr": 0.026616482980501704 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2708333333333333, "acc_stderr": 0.03716177437566016, "acc_norm": 0.2708333333333333, "acc_norm_stderr": 0.03716177437566016 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, 
"acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.23699421965317918, "acc_stderr": 0.03242414757483098, "acc_norm": 0.23699421965317918, "acc_norm_stderr": 0.03242414757483098 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.16, "acc_stderr": 0.03684529491774708, "acc_norm": 0.16, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2425531914893617, "acc_stderr": 0.028020226271200214, "acc_norm": 0.2425531914893617, "acc_norm_stderr": 0.028020226271200214 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.23448275862068965, "acc_stderr": 0.035306258743465914, "acc_norm": 0.23448275862068965, "acc_norm_stderr": 0.035306258743465914 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25132275132275134, "acc_stderr": 0.022340482339643898, "acc_norm": 0.25132275132275134, "acc_norm_stderr": 0.022340482339643898 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2222222222222222, "acc_stderr": 0.037184890068181146, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.037184890068181146 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3161290322580645, "acc_stderr": 0.02645087448904277, "acc_norm": 0.3161290322580645, "acc_norm_stderr": 0.02645087448904277 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.31527093596059114, "acc_stderr": 0.03269080871970186, "acc_norm": 0.31527093596059114, "acc_norm_stderr": 0.03269080871970186 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.24848484848484848, "acc_stderr": 0.03374402644139405, "acc_norm": 0.24848484848484848, "acc_norm_stderr": 0.03374402644139405 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.31313131313131315, "acc_stderr": 0.033042050878136525, "acc_norm": 0.31313131313131315, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.3005181347150259, "acc_stderr": 0.0330881859441575, "acc_norm": 0.3005181347150259, "acc_norm_stderr": 0.0330881859441575 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2128205128205128, "acc_stderr": 0.020752423722128013, "acc_norm": 0.2128205128205128, "acc_norm_stderr": 0.020752423722128013 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.25555555555555554, "acc_stderr": 0.02659393910184408, "acc_norm": 0.25555555555555554, "acc_norm_stderr": 0.02659393910184408 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3277310924369748, 
"acc_stderr": 0.030489911417673227, "acc_norm": 0.3277310924369748, "acc_norm_stderr": 0.030489911417673227 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.038020397601079024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.28440366972477066, "acc_stderr": 0.019342036587702588, "acc_norm": 0.28440366972477066, "acc_norm_stderr": 0.019342036587702588 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.03350991604696042, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.03350991604696042 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25980392156862747, "acc_stderr": 0.030778554678693254, "acc_norm": 0.25980392156862747, "acc_norm_stderr": 0.030778554678693254 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.189873417721519, "acc_stderr": 0.025530100460233497, "acc_norm": 0.189873417721519, "acc_norm_stderr": 0.025530100460233497 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.18834080717488788, "acc_stderr": 0.026241132996407273, "acc_norm": 0.18834080717488788, "acc_norm_stderr": 0.026241132996407273 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.22900763358778625, "acc_stderr": 0.036853466317118506, "acc_norm": 0.22900763358778625, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.19008264462809918, "acc_stderr": 0.03581796951709282, "acc_norm": 0.19008264462809918, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.21296296296296297, "acc_stderr": 0.0395783547198098, "acc_norm": 0.21296296296296297, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22699386503067484, "acc_stderr": 0.032910995786157686, "acc_norm": 0.22699386503067484, "acc_norm_stderr": 0.032910995786157686 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.04007341809755806, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.04007341809755806 }, "harness|hendrycksTest-management|5": { "acc": 0.3592233009708738, "acc_stderr": 0.04750458399041692, "acc_norm": 0.3592233009708738, "acc_norm_stderr": 0.04750458399041692 }, "harness|hendrycksTest-marketing|5": { "acc": 0.20085470085470086, "acc_stderr": 0.026246772946890477, "acc_norm": 0.20085470085470086, "acc_norm_stderr": 0.026246772946890477 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2656449553001277, "acc_stderr": 0.01579430248788873, "acc_norm": 0.2656449553001277, "acc_norm_stderr": 0.01579430248788873 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2138728323699422, "acc_stderr": 0.022075709251757183, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.022075709251757183 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.01446589382985993, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.01446589382985993 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.27450980392156865, "acc_stderr": 0.025553169991826514, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.025553169991826514 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2861736334405145, "acc_stderr": 0.025670259242188943, "acc_norm": 0.2861736334405145, "acc_norm_stderr": 
0.025670259242188943 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2623456790123457, "acc_stderr": 0.024477222856135104, "acc_norm": 0.2623456790123457, "acc_norm_stderr": 0.024477222856135104 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.25886524822695034, "acc_stderr": 0.026129572527180844, "acc_norm": 0.25886524822695034, "acc_norm_stderr": 0.026129572527180844 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2438070404172099, "acc_stderr": 0.010966507972178479, "acc_norm": 0.2438070404172099, "acc_norm_stderr": 0.010966507972178479 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.030211479609121593, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.030211479609121593 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24673202614379086, "acc_stderr": 0.0174408203674025, "acc_norm": 0.24673202614379086, "acc_norm_stderr": 0.0174408203674025 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.24545454545454545, "acc_stderr": 0.041220665028782834, "acc_norm": 0.24545454545454545, "acc_norm_stderr": 0.041220665028782834 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2938775510204082, "acc_stderr": 0.029162738410249762, "acc_norm": 0.2938775510204082, "acc_norm_stderr": 0.029162738410249762 }, "harness|hendrycksTest-sociology|5": { "acc": 0.23383084577114427, "acc_stderr": 0.029929415408348405, "acc_norm": 0.23383084577114427, "acc_norm_stderr": 0.029929415408348405 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-virology|5": { "acc": 0.15060240963855423, "acc_stderr": 0.02784386378726433, "acc_norm": 0.15060240963855423, "acc_norm_stderr": 0.02784386378726433 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.23976608187134502, "acc_stderr": 0.03274485211946956, "acc_norm": 0.23976608187134502, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.23133414932680538, "mc1_stderr": 0.014761945174862661, "mc2": 0.4954484536663258, "mc2_stderr": 0.016312743256662564 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
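The summary above notes that per-task details and aggregated scores live in separate configurations, and that every configuration exposes a "latest" split next to the timestamped one (with "train" pointing at the most recent results). Below is a minimal sketch of pulling the aggregated numbers for this run, assuming it also publishes a "results" configuration laid out like the other leaderboard runs in this dump.

```python
from datasets import load_dataset

# "results" is assumed to hold one row per evaluation run with the aggregated metrics,
# mirroring the config layout shown for the other runs above.
results = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent run
```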
open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf
[ "region:us" ]
2023-08-17T23:01:02+00:00
{"pretty_name": "Evaluation run of quantumaikr/open_llama_7b_hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [quantumaikr/open_llama_7b_hf](https://huggingface.co/quantumaikr/open_llama_7b_hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-19T17:01:48.631436](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf/blob/main/results_2023-07-19T17%3A01%3A48.631436.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2648279332452004,\n \"acc_stderr\": 0.03195749858994142,\n \"acc_norm\": 0.26548960439125074,\n \"acc_norm_stderr\": 0.03196726632461042,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.4954484536663258,\n \"mc2_stderr\": 0.016312743256662564\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23293515358361774,\n \"acc_stderr\": 0.012352507042617391,\n \"acc_norm\": 0.2645051194539249,\n \"acc_norm_stderr\": 0.012889272949313366\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26199960167297354,\n \"acc_stderr\": 0.004388237557526716,\n \"acc_norm\": 0.26946823341963755,\n \"acc_norm_stderr\": 0.004427767996301633\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.036906779861372814,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.036906779861372814\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501704,\n \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501704\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 
0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200214,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200214\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.0330881859441575,\n \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.0330881859441575\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184408,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184408\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.28440366972477066,\n \"acc_stderr\": 0.019342036587702588,\n \"acc_norm\": 0.28440366972477066,\n \"acc_norm_stderr\": 0.019342036587702588\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693254,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693254\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.189873417721519,\n \"acc_stderr\": 0.025530100460233497,\n \"acc_norm\": 0.189873417721519,\n \"acc_norm_stderr\": 0.025530100460233497\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n \"acc_stderr\": 0.026241132996407273,\n \"acc_norm\": 0.18834080717488788,\n \"acc_norm_stderr\": 0.026241132996407273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.04750458399041692,\n \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.04750458399041692\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20085470085470086,\n \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.20085470085470086,\n \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n \"acc_stderr\": 0.01579430248788873,\n \"acc_norm\": 0.2656449553001277,\n \"acc_norm_stderr\": 0.01579430248788873\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826514,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826514\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n \"acc_stderr\": 0.025670259242188943,\n \"acc_norm\": 0.2861736334405145,\n \"acc_norm_stderr\": 0.025670259242188943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135104,\n \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135104\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n \"acc_stderr\": 0.010966507972178479,\n \"acc_norm\": 0.2438070404172099,\n \"acc_norm_stderr\": 0.010966507972178479\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2938775510204082,\n \"acc_stderr\": 0.029162738410249762,\n \"acc_norm\": 0.2938775510204082,\n \"acc_norm_stderr\": 0.029162738410249762\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348405,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348405\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.15060240963855423,\n \"acc_stderr\": 0.02784386378726433,\n \"acc_norm\": 0.15060240963855423,\n \"acc_norm_stderr\": 0.02784386378726433\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.4954484536663258,\n \"mc2_stderr\": 0.016312743256662564\n }\n}\n```", "repo_url": "https://huggingface.co/quantumaikr/open_llama_7b_hf", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:01:48.631436.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_01_48.631436", "path": ["results_2023-07-19T17:01:48.631436.parquet"]}, {"split": "latest", "path": ["results_2023-07-19T17:01:48.631436.parquet"]}]}]}
2023-08-27T11:26:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of quantumaikr/open_llama_7b_hf ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model quantumaikr/open_llama_7b_hf on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2023-07-19T17:01:48.631436 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks). You can find each in the results and the "latest" split for each eval: ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
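The card above refers to a loading snippet that is not reproduced in this flattened text. Below is a minimal sketch of what that call looks like, assuming the details repository follows the naming pattern used for the other entries in this dump (`open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf`) and using the `harness_arc_challenge_25` configuration and `latest` split listed in the metadata above; both the repo id and the choice of configuration are inferences, not quotes from the card.

```python
from datasets import load_dataset

# Repo id and config name are inferred from the metadata above, not quoted
# from the card itself; adjust them if the repository uses a different name.
data = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf",
    "harness_arc_challenge_25",
    split="latest",  # the "latest" split points at the most recent run
)
print(data)
```

Any other `config_name` from the metadata above (for example `harness_truthfulqa_mc_0` or `results`) can be substituted to load that task's details or the aggregated scores.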
[ "# Dataset Card for Evaluation run of quantumaikr/open_llama_7b_hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/open_llama_7b_hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-19T17:01:48.631436 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of quantumaikr/open_llama_7b_hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/open_llama_7b_hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-07-19T17:01:48.631436 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of quantumaikr/open_llama_7b_hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/open_llama_7b_hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-19T17:01:48.631436 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2328c39c47afadfc904ba4f5a7c5aa48f339f8c3
# Dataset Card for Evaluation run of quantumaikr/KoreanLM-hf ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/quantumaikr/KoreanLM-hf - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [quantumaikr/KoreanLM-hf](https://huggingface.co/quantumaikr/KoreanLM-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_quantumaikr__KoreanLM-hf", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T23:49:19.113066](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__KoreanLM-hf/blob/main/results_2023-10-15T23-49-19.113066.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks). You can find each in the results and the "latest" split for each eval: ```python { "all": { "em": 0.16222734899328858, "em_stderr": 0.0037754156899711395, "f1": 0.21587562919463021, "f1_stderr": 0.0038257903227116702, "acc": 0.36591394188393417, "acc_stderr": 0.008953706481200412 }, "harness|drop|3": { "em": 0.16222734899328858, "em_stderr": 0.0037754156899711395, "f1": 0.21587562919463021, "f1_stderr": 0.0038257903227116702 }, "harness|gsm8k|5": { "acc": 0.03411675511751327, "acc_stderr": 0.005000212600773284 }, "harness|winogrande|5": { "acc": 0.6977111286503551, "acc_stderr": 0.012907200361627538 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
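Beyond the winogrande example shown in the card, any of the per-task configurations listed in the metadata below can be loaded the same way. A small illustration using the `harness_drop_3` configuration and its `latest` split, both of which are named in the metadata that follows; the exact per-example columns stored in that split are not documented here, so the code only inspects them.

```python
from datasets import load_dataset

# Load the most recent DROP details for quantumaikr/KoreanLM-hf.
# Config and split names are taken from the dataset metadata below.
drop_details = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__KoreanLM-hf",
    "harness_drop_3",
    split="latest",
)
print(drop_details.column_names)  # see which per-example fields are stored
```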
open-llm-leaderboard/details_quantumaikr__KoreanLM-hf
[ "region:us" ]
2023-08-17T23:01:11+00:00
{"pretty_name": "Evaluation run of quantumaikr/KoreanLM-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [quantumaikr/KoreanLM-hf](https://huggingface.co/quantumaikr/KoreanLM-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__KoreanLM-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T23:49:19.113066](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__KoreanLM-hf/blob/main/results_2023-10-15T23-49-19.113066.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.16222734899328858,\n \"em_stderr\": 0.0037754156899711395,\n \"f1\": 0.21587562919463021,\n \"f1_stderr\": 0.0038257903227116702,\n \"acc\": 0.36591394188393417,\n \"acc_stderr\": 0.008953706481200412\n },\n \"harness|drop|3\": {\n \"em\": 0.16222734899328858,\n \"em_stderr\": 0.0037754156899711395,\n \"f1\": 0.21587562919463021,\n \"f1_stderr\": 0.0038257903227116702\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \"acc_stderr\": 0.005000212600773284\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6977111286503551,\n \"acc_stderr\": 0.012907200361627538\n }\n}\n```", "repo_url": "https://huggingface.co/quantumaikr/KoreanLM-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T23_49_19.113066", "path": ["**/details_harness|drop|3_2023-10-15T23-49-19.113066.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T23-49-19.113066.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T23_49_19.113066", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-49-19.113066.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-49-19.113066.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:37:48.867178.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:37:48.867178.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:37:48.867178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:37:48.867178.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:37:48.867178.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T23_49_19.113066", "path": ["**/details_harness|winogrande|5_2023-10-15T23-49-19.113066.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T23-49-19.113066.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T15_37_48.867178", "path": ["results_2023-07-24T15:37:48.867178.parquet"]}, {"split": "2023_10_15T23_49_19.113066", "path": ["results_2023-10-15T23-49-19.113066.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T23-49-19.113066.parquet"]}]}]}
2023-10-15T22:49:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of quantumaikr/KoreanLM-hf ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model quantumaikr/KoreanLM-hf on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (a minimal example is sketched just below this entry): ## Latest results These are the latest results from run 2023-10-15T23:49:19.113066 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
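The flattened card above references a loading snippet that was stripped out along with its URLs. A minimal sketch of that call is given below: the config name `harness_winogrande_5` and the `latest` split appear in this record's metadata, while the repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming.

```python
from datasets import load_dataset

# Minimal sketch of the load call referenced in the card above.
# Repository id assumed from the leaderboard's details_<org>__<model> pattern;
# the config name and the "latest" split are listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__KoreanLM-hf",
    "harness_winogrande_5",
    split="latest",
)
```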
[ "# Dataset Card for Evaluation run of quantumaikr/KoreanLM-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/KoreanLM-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T23:49:19.113066(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of quantumaikr/KoreanLM-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/KoreanLM-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T23:49:19.113066(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of quantumaikr/KoreanLM-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/KoreanLM-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T23:49:19.113066(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
00f617826441bea2161952e76db8c36d3eccebd5
# Dataset Card for Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-70b-fb16-guanaco-1k](https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). A short example of loading this "results" configuration is sketched after this card.

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-08-10T00:33:03.607588](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k/blob/main/results_2023-08-10T00%3A33%3A03.607588.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7013441332798022, "acc_stderr": 0.03091715385865452, "acc_norm": 0.7054300239648517, "acc_norm_stderr": 0.030884754243271178, "mc1": 0.40636474908200737, "mc1_stderr": 0.0171938358120939, "mc2": 0.5756052671501329, "mc2_stderr": 0.014559658555893657 }, "harness|arc:challenge|25": { "acc": 0.6510238907849829, "acc_stderr": 0.013928933461382501, "acc_norm": 0.7047781569965871, "acc_norm_stderr": 0.013329750293382318 }, "harness|hellaswag|10": { "acc": 0.686018721370245, "acc_stderr": 0.004631603539751948, "acc_norm": 0.8733320055765784, "acc_norm_stderr": 0.00331920940013512 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.041539484047424, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.041539484047424 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7894736842105263, "acc_stderr": 0.03317672787533157, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.03317672787533157 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7320754716981132, "acc_stderr": 0.027257260322494845, "acc_norm": 0.7320754716981132, "acc_norm_stderr": 0.027257260322494845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8402777777777778, "acc_stderr": 0.030635578972093274, "acc_norm": 0.8402777777777778, "acc_norm_stderr": 0.030635578972093274 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736413, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736413 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6638297872340425, "acc_stderr": 0.030881618520676942, "acc_norm": 0.6638297872340425, "acc_norm_stderr": 0.030881618520676942 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.04685473041907789, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.04685473041907789 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6275862068965518, "acc_stderr": 0.04028731532947559, "acc_norm": 0.6275862068965518, "acc_norm_stderr": 0.04028731532947559 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4497354497354497, "acc_stderr": 0.02562085704293665, "acc_norm": 0.4497354497354497, "acc_norm_stderr": 0.02562085704293665 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 
0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8354838709677419, "acc_stderr": 0.021090847745939306, "acc_norm": 0.8354838709677419, "acc_norm_stderr": 0.021090847745939306 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5320197044334976, "acc_stderr": 0.035107665979592154, "acc_norm": 0.5320197044334976, "acc_norm_stderr": 0.035107665979592154 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.8, "acc_stderr": 0.04020151261036845, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8242424242424242, "acc_stderr": 0.02972094300622445, "acc_norm": 0.8242424242424242, "acc_norm_stderr": 0.02972094300622445 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.898989898989899, "acc_stderr": 0.02146973557605533, "acc_norm": 0.898989898989899, "acc_norm_stderr": 0.02146973557605533 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9378238341968912, "acc_stderr": 0.017426974154240528, "acc_norm": 0.9378238341968912, "acc_norm_stderr": 0.017426974154240528 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7333333333333333, "acc_stderr": 0.022421273612923714, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.022421273612923714 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7605042016806722, "acc_stderr": 0.02772206549336127, "acc_norm": 0.7605042016806722, "acc_norm_stderr": 0.02772206549336127 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.45695364238410596, "acc_stderr": 0.04067325174247443, "acc_norm": 0.45695364238410596, "acc_norm_stderr": 0.04067325174247443 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8935779816513761, "acc_stderr": 0.013221554674594372, "acc_norm": 0.8935779816513761, "acc_norm_stderr": 0.013221554674594372 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.625, "acc_stderr": 0.033016908987210894, "acc_norm": 0.625, "acc_norm_stderr": 0.033016908987210894 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9313725490196079, "acc_stderr": 0.017744453647073312, "acc_norm": 0.9313725490196079, "acc_norm_stderr": 0.017744453647073312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8860759493670886, "acc_stderr": 0.020681745135884562, "acc_norm": 0.8860759493670886, "acc_norm_stderr": 0.020681745135884562 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7847533632286996, "acc_stderr": 0.027584066602208274, "acc_norm": 0.7847533632286996, "acc_norm_stderr": 0.027584066602208274 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8549618320610687, "acc_stderr": 0.030884661089515375, "acc_norm": 0.8549618320610687, "acc_norm_stderr": 0.030884661089515375 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.03092278832044579, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.03092278832044579 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8425925925925926, "acc_stderr": 0.035207039905179635, "acc_norm": 0.8425925925925926, "acc_norm_stderr": 
0.035207039905179635 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.03675668832233188, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.03675668832233188 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8974358974358975, "acc_stderr": 0.01987565502786746, "acc_norm": 0.8974358974358975, "acc_norm_stderr": 0.01987565502786746 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8646232439335888, "acc_stderr": 0.012234384586856488, "acc_norm": 0.8646232439335888, "acc_norm_stderr": 0.012234384586856488 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7832369942196532, "acc_stderr": 0.022183477668412856, "acc_norm": 0.7832369942196532, "acc_norm_stderr": 0.022183477668412856 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5910614525139665, "acc_stderr": 0.016442830654715544, "acc_norm": 0.5910614525139665, "acc_norm_stderr": 0.016442830654715544 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7679738562091504, "acc_stderr": 0.024170840879340873, "acc_norm": 0.7679738562091504, "acc_norm_stderr": 0.024170840879340873 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.77491961414791, "acc_stderr": 0.023720088516179027, "acc_norm": 0.77491961414791, "acc_norm_stderr": 0.023720088516179027 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8364197530864198, "acc_stderr": 0.02058146613825712, "acc_norm": 0.8364197530864198, "acc_norm_stderr": 0.02058146613825712 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5709219858156028, "acc_stderr": 0.029525914302558562, "acc_norm": 0.5709219858156028, "acc_norm_stderr": 0.029525914302558562 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5560625814863103, "acc_stderr": 0.012689708167787679, "acc_norm": 0.5560625814863103, "acc_norm_stderr": 0.012689708167787679 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7536764705882353, "acc_stderr": 0.02617343857052, "acc_norm": 0.7536764705882353, "acc_norm_stderr": 0.02617343857052 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7565359477124183, "acc_stderr": 0.017362473762146613, "acc_norm": 0.7565359477124183, "acc_norm_stderr": 0.017362473762146613 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7454545454545455, "acc_stderr": 0.041723430387053825, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8204081632653061, "acc_stderr": 0.024573293589585637, "acc_norm": 0.8204081632653061, "acc_norm_stderr": 0.024573293589585637 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8905472636815921, "acc_stderr": 0.022076326101824657, "acc_norm": 0.8905472636815921, "acc_norm_stderr": 0.022076326101824657 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 
0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8538011695906432, "acc_stderr": 0.027097290118070806, "acc_norm": 0.8538011695906432, "acc_norm_stderr": 0.027097290118070806 }, "harness|truthfulqa:mc|0": { "mc1": 0.40636474908200737, "mc1_stderr": 0.0171938358120939, "mc2": 0.5756052671501329, "mc2_stderr": 0.014559658555893657 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k
[ "region:us" ]
2023-08-17T23:01:20+00:00
{"pretty_name": "Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k", "dataset_summary": "Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-70b-fb16-guanaco-1k](https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-10T00:33:03.607588](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k/blob/main/results_2023-08-10T00%3A33%3A03.607588.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7013441332798022,\n \"acc_stderr\": 0.03091715385865452,\n \"acc_norm\": 0.7054300239648517,\n \"acc_norm_stderr\": 0.030884754243271178,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5756052671501329,\n \"mc2_stderr\": 0.014559658555893657\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382501,\n \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382318\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.686018721370245,\n \"acc_stderr\": 0.004631603539751948,\n \"acc_norm\": 0.8733320055765784,\n \"acc_norm_stderr\": 0.00331920940013512\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947559,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947559\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8354838709677419,\n \"acc_stderr\": 0.021090847745939306,\n \"acc_norm\": 0.8354838709677419,\n \"acc_norm_stderr\": 0.021090847745939306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.898989898989899,\n \"acc_stderr\": 0.02146973557605533,\n \"acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.02146973557605533\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.022421273612923714,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.022421273612923714\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.02772206549336127,\n \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.02772206549336127\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8935779816513761,\n \"acc_stderr\": 0.013221554674594372,\n \"acc_norm\": 0.8935779816513761,\n \"acc_norm_stderr\": 0.013221554674594372\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073312,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n \"acc_stderr\": 
0.012234384586856488,\n \"acc_norm\": 0.8646232439335888,\n \"acc_norm_stderr\": 0.012234384586856488\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5910614525139665,\n \"acc_stderr\": 0.016442830654715544,\n \"acc_norm\": 0.5910614525139665,\n \"acc_norm_stderr\": 0.016442830654715544\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.02058146613825712,\n \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.02058146613825712\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5709219858156028,\n \"acc_stderr\": 0.029525914302558562,\n \"acc_norm\": 0.5709219858156028,\n \"acc_norm_stderr\": 0.029525914302558562\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n \"acc_stderr\": 0.012689708167787679,\n \"acc_norm\": 0.5560625814863103,\n \"acc_norm_stderr\": 0.012689708167787679\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146613,\n \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146613\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5756052671501329,\n \"mc2_stderr\": 0.014559658555893657\n }\n}\n```", "repo_url": "https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:33:03.607588.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T00_33_03.607588", "path": ["results_2023-08-10T00:33:03.607588.parquet"]}, {"split": "latest", "path": ["results_2023-08-10T00:33:03.607588.parquet"]}]}]}
2023-08-27T11:26:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model quantumaikr/llama-2-70b-fb16-guanaco-1k on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-10T00:33:03.607588 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
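The summary above ends at "To load the details from a run, you can for instance do the following:" without the accompanying snippet (it is stripped in this processed text). A minimal sketch follows; the repository id is inferred from the leaderboard's usual `details_<org>__<model>` naming pattern and is an assumption, while the config and split names are taken from the metadata listed earlier in this record.

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's "details_<org>__<model>" naming pattern (assumption);
# the config and split names below appear in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k",
    "harness_truthfulqa_mc_0",  # any config listed in the metadata works here
    split="latest",             # or the timestamped split "2023_08_10T00_33_03.607588"
)
```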
[ "# Dataset Card for Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/llama-2-70b-fb16-guanaco-1k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-10T00:33:03.607588 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/llama-2-70b-fb16-guanaco-1k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-10T00:33:03.607588 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/llama-2-70b-fb16-guanaco-1k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-10T00:33:03.607588 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
40e901aa77e1f9dd9191ebe469409fb42afdbc94
# Dataset Card for Evaluation run of quantumaikr/QuantumLM-70B-hf

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/QuantumLM-70B-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [quantumaikr/QuantumLM-70B-hf](https://huggingface.co/quantumaikr/QuantumLM-70B-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__QuantumLM-70B-hf",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T00:55:40.743446](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM-70B-hf/blob/main/results_2023-09-23T00-55-40.743446.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.006291946308724832,
        "em_stderr": 0.0008097697705635293,
        "f1": 0.07323301174496649,
        "f1_stderr": 0.001570960525115463,
        "acc": 0.4677633614233835,
        "acc_stderr": 0.010635106183196841
    },
    "harness|drop|3": {
        "em": 0.006291946308724832,
        "em_stderr": 0.0008097697705635293,
        "f1": 0.07323301174496649,
        "f1_stderr": 0.001570960525115463
    },
    "harness|gsm8k|5": {
        "acc": 0.14783927217589082,
        "acc_stderr": 0.009776827679143901
    },
    "harness|winogrande|5": {
        "acc": 0.7876874506708761,
        "acc_stderr": 0.011493384687249782
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
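The card explains that the aggregated scores live in the "results" configuration but only shows how to load a task config. As a small additional sketch (not part of the original card), the aggregated metrics for the most recent run can be read like this, using the config and split names given in the metadata below:

```python
from datasets import load_dataset

# "results" holds the aggregated scores; "latest" points at the most recent run
# (2023-09-23T00-55-40.743446 for this repository, per the metadata below).
results = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__QuantumLM-70B-hf",
    "results",
    split="latest",
)
print(results[0])  # aggregated drop/gsm8k/winogrande metrics, as shown in "Latest results"
```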
open-llm-leaderboard/details_quantumaikr__QuantumLM-70B-hf
[ "region:us" ]
2023-08-17T23:01:29+00:00
{"pretty_name": "Evaluation run of quantumaikr/QuantumLM-70B-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [quantumaikr/QuantumLM-70B-hf](https://huggingface.co/quantumaikr/QuantumLM-70B-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__QuantumLM-70B-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T00:55:40.743446](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM-70B-hf/blob/main/results_2023-09-23T00-55-40.743446.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006291946308724832,\n \"em_stderr\": 0.0008097697705635293,\n \"f1\": 0.07323301174496649,\n \"f1_stderr\": 0.001570960525115463,\n \"acc\": 0.4677633614233835,\n \"acc_stderr\": 0.010635106183196841\n },\n \"harness|drop|3\": {\n \"em\": 0.006291946308724832,\n \"em_stderr\": 0.0008097697705635293,\n \"f1\": 0.07323301174496649,\n \"f1_stderr\": 0.001570960525115463\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14783927217589082,\n \"acc_stderr\": 0.009776827679143901\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249782\n }\n}\n```", "repo_url": "https://huggingface.co/quantumaikr/QuantumLM-70B-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|arc:challenge|25_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T00_55_40.743446", "path": ["**/details_harness|drop|3_2023-09-23T00-55-40.743446.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T00-55-40.743446.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T00_55_40.743446", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-55-40.743446.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-55-40.743446.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hellaswag|10_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T17:33:51.061360.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T17:33:51.061360.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T17:33:51.061360.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T17:33:51.061360.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T17:33:51.061360.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T00_55_40.743446", "path": ["**/details_harness|winogrande|5_2023-09-23T00-55-40.743446.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T00-55-40.743446.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T17_33_51.061360", "path": ["results_2023-07-31T17:33:51.061360.parquet"]}, {"split": "2023_09_23T00_55_40.743446", "path": ["results_2023-09-23T00-55-40.743446.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T00-55-40.743446.parquet"]}]}]}
2023-09-22T23:55:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of quantumaikr/QuantumLM-70B-hf ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model quantumaikr/QuantumLM-70B-hf on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T00:55:40.743446 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of quantumaikr/QuantumLM-70B-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/QuantumLM-70B-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:55:40.743446(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of quantumaikr/QuantumLM-70B-hf", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/QuantumLM-70B-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:55:40.743446(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of quantumaikr/QuantumLM-70B-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/QuantumLM-70B-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T00:55:40.743446(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
98e81c739b875c699f190724fd7d5de5ddcb3d1d
# Dataset Card for Evaluation run of TinyPixel/llama2-7b-instruct ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TinyPixel/llama2-7b-instruct - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [TinyPixel/llama2-7b-instruct](https://huggingface.co/TinyPixel/llama2-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TinyPixel__llama2-7b-instruct", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-17T12:12:37.965756](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__llama2-7b-instruct/blob/main/results_2023-08-17T12%3A12%3A37.965756.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4639503533482998, "acc_stderr": 0.03519400615590806, "acc_norm": 0.467921814589003, "acc_norm_stderr": 0.03517936985393269, "mc1": 0.26438188494492043, "mc1_stderr": 0.015438211119522512, "mc2": 0.39481096196846566, "mc2_stderr": 0.013796205321597201 }, "harness|arc:challenge|25": { "acc": 0.49829351535836175, "acc_stderr": 0.01461130570505699, "acc_norm": 0.5358361774744027, "acc_norm_stderr": 0.01457381366473572 }, "harness|hellaswag|10": { "acc": 0.5910177255526787, "acc_stderr": 0.004906411984476793, "acc_norm": 0.7877912766381199, "acc_norm_stderr": 0.00408036220825117 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.45925925925925926, "acc_stderr": 0.04304979692464242, "acc_norm": 0.45925925925925926, "acc_norm_stderr": 0.04304979692464242 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.39473684210526316, "acc_stderr": 0.039777499346220734, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.039777499346220734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4528301886792453, "acc_stderr": 0.03063562795796182, "acc_norm": 0.4528301886792453, "acc_norm_stderr": 0.03063562795796182 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4513888888888889, "acc_stderr": 0.04161402398403279, "acc_norm": 0.4513888888888889, "acc_norm_stderr": 0.04161402398403279 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092055, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.23529411764705882, "acc_stderr": 0.04220773659171453, "acc_norm": 0.23529411764705882, "acc_norm_stderr": 0.04220773659171453 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4340425531914894, "acc_stderr": 0.03240038086792747, "acc_norm": 0.4340425531914894, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.041857744240220554, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.041857744240220554 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.46206896551724136, "acc_stderr": 0.041546596717075474, "acc_norm": 0.46206896551724136, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2724867724867725, "acc_stderr": 0.022930973071633366, "acc_norm": 0.2724867724867725, "acc_norm_stderr": 0.022930973071633366 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147126, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147126 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.4935483870967742, "acc_stderr": 0.02844163823354051, "acc_norm": 0.4935483870967742, "acc_norm_stderr": 0.02844163823354051 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.32019704433497537, "acc_stderr": 0.032826493853041504, "acc_norm": 0.32019704433497537, "acc_norm_stderr": 0.032826493853041504 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6303030303030303, "acc_stderr": 0.03769430314512566, "acc_norm": 0.6303030303030303, "acc_norm_stderr": 0.03769430314512566 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.48484848484848486, "acc_stderr": 0.03560716516531061, "acc_norm": 0.48484848484848486, "acc_norm_stderr": 0.03560716516531061 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6839378238341969, "acc_stderr": 0.033553973696861736, "acc_norm": 0.6839378238341969, "acc_norm_stderr": 0.033553973696861736 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4282051282051282, "acc_stderr": 0.025088301454694834, "acc_norm": 0.4282051282051282, "acc_norm_stderr": 0.025088301454694834 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2740740740740741, "acc_stderr": 0.027195934804085622, "acc_norm": 0.2740740740740741, "acc_norm_stderr": 0.027195934804085622 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.4411764705882353, "acc_stderr": 0.0322529423239964, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.0322529423239964 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.03734535676787198, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.03734535676787198 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6220183486238532, "acc_stderr": 0.02078918706672811, "acc_norm": 0.6220183486238532, "acc_norm_stderr": 0.02078918706672811 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.24537037037037038, "acc_stderr": 0.029346665094372937, "acc_norm": 0.24537037037037038, "acc_norm_stderr": 0.029346665094372937 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5245098039215687, "acc_stderr": 0.03505093194348798, "acc_norm": 0.5245098039215687, "acc_norm_stderr": 0.03505093194348798 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6033755274261603, "acc_stderr": 0.03184399873811225, "acc_norm": 0.6033755274261603, "acc_norm_stderr": 0.03184399873811225 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5515695067264574, "acc_stderr": 0.033378837362550984, "acc_norm": 0.5515695067264574, "acc_norm_stderr": 0.033378837362550984 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5267175572519084, "acc_stderr": 0.04379024936553894, "acc_norm": 0.5267175572519084, "acc_norm_stderr": 0.04379024936553894 }, "harness|hendrycksTest-international_law|5": { "acc": 0.628099173553719, "acc_stderr": 0.044120158066245044, "acc_norm": 0.628099173553719, "acc_norm_stderr": 0.044120158066245044 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5370370370370371, "acc_stderr": 0.04820403072760628, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.04820403072760628 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.50920245398773, "acc_stderr": 0.03927705600787443, "acc_norm": 0.50920245398773, "acc_norm_stderr": 0.03927705600787443 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.5728155339805825, "acc_stderr": 0.048979577377811674, "acc_norm": 0.5728155339805825, "acc_norm_stderr": 0.048979577377811674 }, "harness|hendrycksTest-marketing|5": { "acc": 0.688034188034188, "acc_stderr": 0.030351527323344937, "acc_norm": 0.688034188034188, "acc_norm_stderr": 0.030351527323344937 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6411238825031929, "acc_stderr": 0.017152991797501342, "acc_norm": 0.6411238825031929, "acc_norm_stderr": 0.017152991797501342 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.49710982658959535, "acc_stderr": 0.026918645383239015, "acc_norm": 0.49710982658959535, "acc_norm_stderr": 0.026918645383239015 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.49673202614379086, "acc_stderr": 0.028629305194003543, "acc_norm": 0.49673202614379086, "acc_norm_stderr": 0.028629305194003543 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6045016077170418, "acc_stderr": 0.027770918531427838, "acc_norm": 0.6045016077170418, "acc_norm_stderr": 
0.027770918531427838 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5092592592592593, "acc_stderr": 0.027815973433878014, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.027815973433878014 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.36524822695035464, "acc_stderr": 0.028723863853281278, "acc_norm": 0.36524822695035464, "acc_norm_stderr": 0.028723863853281278 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.36897001303780963, "acc_stderr": 0.01232393665017486, "acc_norm": 0.36897001303780963, "acc_norm_stderr": 0.01232393665017486 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5073529411764706, "acc_stderr": 0.030369552523902173, "acc_norm": 0.5073529411764706, "acc_norm_stderr": 0.030369552523902173 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.44281045751633985, "acc_stderr": 0.020095083154577344, "acc_norm": 0.44281045751633985, "acc_norm_stderr": 0.020095083154577344 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, "acc_stderr": 0.04782001791380061, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.04782001791380061 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.46122448979591835, "acc_stderr": 0.03191282052669277, "acc_norm": 0.46122448979591835, "acc_norm_stderr": 0.03191282052669277 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6218905472636815, "acc_stderr": 0.034288678487786564, "acc_norm": 0.6218905472636815, "acc_norm_stderr": 0.034288678487786564 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-virology|5": { "acc": 0.39156626506024095, "acc_stderr": 0.03799857454479637, "acc_norm": 0.39156626506024095, "acc_norm_stderr": 0.03799857454479637 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.034462962170884265, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.26438188494492043, "mc1_stderr": 0.015438211119522512, "mc2": 0.39481096196846566, "mc2_stderr": 0.013796205321597201 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_TinyPixel__llama2-7b-instruct
[ "region:us" ]
2023-08-17T23:01:38+00:00
{"pretty_name": "Evaluation run of TinyPixel/llama2-7b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [TinyPixel/llama2-7b-instruct](https://huggingface.co/TinyPixel/llama2-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__llama2-7b-instruct\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-17T12:12:37.965756](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__llama2-7b-instruct/blob/main/results_2023-08-17T12%3A12%3A37.965756.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4639503533482998,\n \"acc_stderr\": 0.03519400615590806,\n \"acc_norm\": 0.467921814589003,\n \"acc_norm_stderr\": 0.03517936985393269,\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.39481096196846566,\n \"mc2_stderr\": 0.013796205321597201\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49829351535836175,\n \"acc_stderr\": 0.01461130570505699,\n \"acc_norm\": 0.5358361774744027,\n \"acc_norm_stderr\": 0.01457381366473572\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5910177255526787,\n \"acc_stderr\": 0.004906411984476793,\n \"acc_norm\": 0.7877912766381199,\n \"acc_norm_stderr\": 0.00408036220825117\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.03063562795796182,\n \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.03063562795796182\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n 
\"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633366,\n \"acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633366\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.4935483870967742,\n \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03560716516531061,\n \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03560716516531061\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694834,\n \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694834\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6220183486238532,\n \"acc_stderr\": 0.02078918706672811,\n \"acc_norm\": 0.6220183486238532,\n \"acc_norm_stderr\": 0.02078918706672811\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24537037037037038,\n \"acc_stderr\": 0.029346665094372937,\n \"acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.029346665094372937\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5245098039215687,\n \"acc_stderr\": 0.03505093194348798,\n \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.03505093194348798\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811225,\n \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811225\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n \"acc_stderr\": 0.033378837362550984,\n \"acc_norm\": 0.5515695067264574,\n \"acc_norm_stderr\": 0.033378837362550984\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.048979577377811674,\n \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.048979577377811674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n \"acc_stderr\": 0.030351527323344937,\n \"acc_norm\": 0.688034188034188,\n \"acc_norm_stderr\": 0.030351527323344937\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6411238825031929,\n \"acc_stderr\": 0.017152991797501342,\n 
\"acc_norm\": 0.6411238825031929,\n \"acc_norm_stderr\": 0.017152991797501342\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.026918645383239015,\n \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.026918645383239015\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n \"acc_stderr\": 0.027770918531427838,\n \"acc_norm\": 0.6045016077170418,\n \"acc_norm_stderr\": 0.027770918531427838\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36897001303780963,\n \"acc_stderr\": 0.01232393665017486,\n \"acc_norm\": 0.36897001303780963,\n \"acc_norm_stderr\": 0.01232393665017486\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.44281045751633985,\n \"acc_stderr\": 0.020095083154577344,\n \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.020095083154577344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.03191282052669277,\n \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.03191282052669277\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.39481096196846566,\n \"mc2_stderr\": 0.013796205321597201\n }\n}\n```", "repo_url": "https://huggingface.co/TinyPixel/llama2-7b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|arc:challenge|25_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hellaswag|10_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:12:37.965756.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:12:37.965756.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:12:37.965756.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T12:12:37.965756.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T12:12:37.965756.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T12_12_37.965756", "path": ["results_2023-08-17T12:12:37.965756.parquet"]}, {"split": "latest", "path": ["results_2023-08-17T12:12:37.965756.parquet"]}]}]}
2023-08-27T11:26:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TinyPixel/llama2-7b-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TinyPixel/llama2-7b-instruct on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch just after this card): ## Latest results These are the latest results from run 2023-08-17T12:12:37.965756 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
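The card above points at a loading example; a minimal sketch is given here. The repository id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention for TinyPixel/llama2-7b-instruct, and `harness_truthfulqa_mc_0` is one of the config names listed in the configs metadata above.

```python
from datasets import load_dataset

# Minimal sketch, not the card's verbatim snippet: the repository id below is an
# assumption based on the leaderboard's "details_<org>__<model>" naming convention;
# "harness_truthfulqa_mc_0" is one of the config names listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_TinyPixel__llama2-7b-instruct",
    "harness_truthfulqa_mc_0",
    split="train",  # the card states that "train" always points to the latest results
)
print(data[0])
```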
[ "# Dataset Card for Evaluation run of TinyPixel/llama2-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TinyPixel/llama2-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T12:12:37.965756 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TinyPixel/llama2-7b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TinyPixel/llama2-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-17T12:12:37.965756 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TinyPixel/llama2-7b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TinyPixel/llama2-7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-17T12:12:37.965756 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ed0e92c1356db0fe325e0858292211c04d53d433
# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct](https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T04:21:29.440361](https://huggingface.co/datasets/open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct/blob/main/results_2023-10-15T04-21-29.440361.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.1923238255033557,
        "em_stderr": 0.0040362200154763495,
        "f1": 0.23113255033557045,
        "f1_stderr": 0.0040754338170676495,
        "acc": 0.27782162588792425,
        "acc_stderr": 0.006982598384541777
    },
    "harness|drop|3": {
        "em": 0.1923238255033557,
        "em_stderr": 0.0040362200154763495,
        "f1": 0.23113255033557045,
        "f1_stderr": 0.0040754338170676495
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5556432517758485,
        "acc_stderr": 0.013965196769083555
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct
[ "region:us" ]
2023-08-17T23:01:47+00:00
{"pretty_name": "Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct](https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T04:21:29.440361](https://huggingface.co/datasets/open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct/blob/main/results_2023-10-15T04-21-29.440361.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1923238255033557,\n \"em_stderr\": 0.0040362200154763495,\n \"f1\": 0.23113255033557045,\n \"f1_stderr\": 0.0040754338170676495,\n \"acc\": 0.27782162588792425,\n \"acc_stderr\": 0.006982598384541777\n },\n \"harness|drop|3\": {\n \"em\": 0.1923238255033557,\n \"em_stderr\": 0.0040362200154763495,\n \"f1\": 0.23113255033557045,\n \"f1_stderr\": 0.0040754338170676495\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5556432517758485,\n \"acc_stderr\": 0.013965196769083555\n }\n}\n```", "repo_url": "https://huggingface.co/GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T04_21_29.440361", "path": ["**/details_harness|drop|3_2023-10-15T04-21-29.440361.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T04-21-29.440361.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T04_21_29.440361", "path": ["**/details_harness|gsm8k|5_2023-10-15T04-21-29.440361.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T04-21-29.440361.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": 
["**/details_harness|hellaswag|10_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:31:16.803242.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:31:16.803242.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:31:16.803242.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T04_21_29.440361", "path": ["**/details_harness|winogrande|5_2023-10-15T04-21-29.440361.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T04-21-29.440361.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T20_31_16.803242", "path": ["results_2023-07-19T20:31:16.803242.parquet"]}, {"split": "2023_10_15T04_21_29.440361", "path": ["results_2023-10-15T04-21-29.440361.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T04-21-29.440361.parquet"]}]}]}
2023-10-15T03:21:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch just after this card): ## Latest results These are the latest results from run 2023-10-15T04:21:29.440361 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
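Since this dataset has been created from two runs, every config in the metadata above exposes timestamped splits plus a `latest` split, and the aggregated metrics live in the `results` config. A minimal sketch of reading the latest aggregated results, assuming the split names listed in the configs metadata are exposed as loadable dataset splits, could look like this:

```python
from datasets import load_dataset

# Repository id as given in the record above; "results" and "latest" are taken
# from the configs metadata. Assumes the listed split names load as-is.
REPO = "open-llm-leaderboard/details_GeorgiaTechResearchInstitute__starcoder-gpteacher-code-instruct"
results = load_dataset(REPO, "results", split="latest")

# Each row of "results" holds the aggregated metrics of one evaluation run.
print(results[0])
```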
[ "# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T04:21:29.440361(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T04:21:29.440361(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/starcoder-gpteacher-code-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T04:21:29.440361(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
73382c53b2bb7cbb6398f3e2704b19c27c7bc184
# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/galpaca-30b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/GeorgiaTechResearchInstitute/galpaca-30b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [GeorgiaTechResearchInstitute/galpaca-30b](https://huggingface.co/GeorgiaTechResearchInstitute/galpaca-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_GeorgiaTechResearchInstitute__galpaca-30b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T01:05:49.975308](https://huggingface.co/datasets/open-llm-leaderboard/details_GeorgiaTechResearchInstitute__galpaca-30b/blob/main/results_2023-10-15T01-05-49.975308.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.205746644295302, "em_stderr": 0.00413985910625819, "f1": 0.288852768456377, "f1_stderr": 0.004183984738478157, "acc": 0.3265751062277813, "acc_stderr": 0.009076887028812184 }, "harness|drop|3": { "em": 0.205746644295302, "em_stderr": 0.00413985910625819, "f1": 0.288852768456377, "f1_stderr": 0.004183984738478157 }, "harness|gsm8k|5": { "acc": 0.028051554207733132, "acc_stderr": 0.00454822953383636 }, "harness|winogrande|5": { "acc": 0.6250986582478295, "acc_stderr": 0.013605544523788008 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
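As a minimal sketch, assuming a recent `datasets` installation with `pandas` available, the snippet below shows one way to enumerate the configurations of this details repository and to pull both the latest per-task details and the latest aggregated results; the repository id, the `harness_winogrande_5` and `results` configurations, and the `latest` split all come from the card and metadata in this record.

```python
from datasets import get_dataset_config_names, load_dataset

repo_id = "open-llm-leaderboard/details_GeorgiaTechResearchInstitute__galpaca-30b"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo_id)
print(f"{len(configs)} configurations, e.g. {configs[:5]}")

# Per-sample details of the most recent winogrande run.
winogrande = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(winogrande)

# Aggregated metrics of the most recent run, as a pandas DataFrame.
results = load_dataset(repo_id, "results", split="latest")
print(results.to_pandas().head())
```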
open-llm-leaderboard/details_GeorgiaTechResearchInstitute__galpaca-30b
[ "region:us" ]
2023-08-17T23:01:56+00:00
{"pretty_name": "Evaluation run of GeorgiaTechResearchInstitute/galpaca-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [GeorgiaTechResearchInstitute/galpaca-30b](https://huggingface.co/GeorgiaTechResearchInstitute/galpaca-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GeorgiaTechResearchInstitute__galpaca-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T01:05:49.975308](https://huggingface.co/datasets/open-llm-leaderboard/details_GeorgiaTechResearchInstitute__galpaca-30b/blob/main/results_2023-10-15T01-05-49.975308.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.205746644295302,\n \"em_stderr\": 0.00413985910625819,\n \"f1\": 0.288852768456377,\n \"f1_stderr\": 0.004183984738478157,\n \"acc\": 0.3265751062277813,\n \"acc_stderr\": 0.009076887028812184\n },\n \"harness|drop|3\": {\n \"em\": 0.205746644295302,\n \"em_stderr\": 0.00413985910625819,\n \"f1\": 0.288852768456377,\n \"f1_stderr\": 0.004183984738478157\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \"acc_stderr\": 0.00454822953383636\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6250986582478295,\n \"acc_stderr\": 0.013605544523788008\n }\n}\n```", "repo_url": "https://huggingface.co/GeorgiaTechResearchInstitute/galpaca-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|arc:challenge|25_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T10_10_35.995390", "path": ["**/details_harness|drop|3_2023-09-23T10-10-35.995390.parquet"]}, {"split": "2023_10_15T01_05_49.975308", "path": ["**/details_harness|drop|3_2023-10-15T01-05-49.975308.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T01-05-49.975308.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T10_10_35.995390", "path": ["**/details_harness|gsm8k|5_2023-09-23T10-10-35.995390.parquet"]}, {"split": "2023_10_15T01_05_49.975308", "path": ["**/details_harness|gsm8k|5_2023-10-15T01-05-49.975308.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T01-05-49.975308.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hellaswag|10_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:01:48.453969.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:01:48.453969.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:18:52.169485.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:18:52.169485.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:18:52.169485.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T12:18:52.169485.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": 
"2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": 
"2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T12:18:52.169485.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T12:18:52.169485.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T10_10_35.995390", "path": ["**/details_harness|winogrande|5_2023-09-23T10-10-35.995390.parquet"]}, {"split": "2023_10_15T01_05_49.975308", "path": ["**/details_harness|winogrande|5_2023-10-15T01-05-49.975308.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T01-05-49.975308.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_01_48.453969", "path": ["results_2023-07-19T22:01:48.453969.parquet"]}, {"split": "2023_08_09T12_18_52.169485", "path": ["results_2023-08-09T12:18:52.169485.parquet"]}, {"split": "2023_09_23T10_10_35.995390", "path": ["results_2023-09-23T10-10-35.995390.parquet"]}, {"split": "2023_10_15T01_05_49.975308", "path": ["results_2023-10-15T01-05-49.975308.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T01-05-49.975308.parquet"]}]}]}
2023-10-15T00:05:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/galpaca-30b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/galpaca-30b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T01:05:49.975308 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
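The loading step referenced above ("you can for instance do the following") can be sketched as below. The repository id is an assumption inferred from the `details_<org>__<model>` naming pattern these evaluation-detail datasets follow (it is not stated verbatim in this record); `harness_winogrande_5` is one of the configurations listed in the metadata above.

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_GeorgiaTechResearchInstitute__galpaca-30b",
    "harness_winogrande_5",
    split="train",
)
```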
[ "# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/galpaca-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/galpaca-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T01:05:49.975308(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/galpaca-30b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/galpaca-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T01:05:49.975308(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of GeorgiaTechResearchInstitute/galpaca-30b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model GeorgiaTechResearchInstitute/galpaca-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T01:05:49.975308(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
71d508f0ef0c0c3338c79b77427497c11852562b
# Dataset Card for Evaluation run of kfkas/Llama-2-ko-7b-Chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/kfkas/Llama-2-ko-7b-Chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [kfkas/Llama-2-ko-7b-Chat](https://huggingface.co/kfkas/Llama-2-ko-7b-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kfkas__Llama-2-ko-7b-Chat",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T06:20:53.119467](https://huggingface.co/datasets/open-llm-leaderboard/details_kfkas__Llama-2-ko-7b-Chat/blob/main/results_2023-09-18T06-20-53.119467.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.029886744966442953,
        "em_stderr": 0.0017437739254467523,
        "f1": 0.11206061241610675,
        "f1_stderr": 0.002589360675643281,
        "acc": 0.3406984196130502,
        "acc_stderr": 0.008168649232732146
    },
    "harness|drop|3": {
        "em": 0.029886744966442953,
        "em_stderr": 0.0017437739254467523,
        "f1": 0.11206061241610675,
        "f1_stderr": 0.002589360675643281
    },
    "harness|gsm8k|5": {
        "acc": 0.01288855193328279,
        "acc_stderr": 0.003106901266499642
    },
    "harness|winogrande|5": {
        "acc": 0.6685082872928176,
        "acc_stderr": 0.01323039719896465
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_kfkas__Llama-2-ko-7b-Chat
[ "region:us" ]
2023-08-17T23:02:13+00:00
{"pretty_name": "Evaluation run of kfkas/Llama-2-ko-7b-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [kfkas/Llama-2-ko-7b-Chat](https://huggingface.co/kfkas/Llama-2-ko-7b-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kfkas__Llama-2-ko-7b-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T06:20:53.119467](https://huggingface.co/datasets/open-llm-leaderboard/details_kfkas__Llama-2-ko-7b-Chat/blob/main/results_2023-09-18T06-20-53.119467.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029886744966442953,\n \"em_stderr\": 0.0017437739254467523,\n \"f1\": 0.11206061241610675,\n \"f1_stderr\": 0.002589360675643281,\n \"acc\": 0.3406984196130502,\n \"acc_stderr\": 0.008168649232732146\n },\n \"harness|drop|3\": {\n \"em\": 0.029886744966442953,\n \"em_stderr\": 0.0017437739254467523,\n \"f1\": 0.11206061241610675,\n \"f1_stderr\": 0.002589360675643281\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \"acc_stderr\": 0.003106901266499642\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.01323039719896465\n }\n}\n```", "repo_url": "https://huggingface.co/kfkas/Llama-2-ko-7b-Chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|arc:challenge|25_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|arc:challenge|25_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T05_11_56.274160", "path": ["**/details_harness|drop|3_2023-09-17T05-11-56.274160.parquet"]}, {"split": "2023_09_18T06_20_53.119467", "path": ["**/details_harness|drop|3_2023-09-18T06-20-53.119467.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T06-20-53.119467.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T05_11_56.274160", "path": ["**/details_harness|gsm8k|5_2023-09-17T05-11-56.274160.parquet"]}, {"split": "2023_09_18T06_20_53.119467", "path": ["**/details_harness|gsm8k|5_2023-09-18T06-20-53.119467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-09-18T06-20-53.119467.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hellaswag|10_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hellaswag|10_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:54:54.901743.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T10:54:54.901743.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T16:15:02.960730.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T16:15:02.960730.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T16:15:02.960730.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T16:15:02.960730.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": 
"2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": 
"2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T16:15:02.960730.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T16:15:02.960730.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T05_11_56.274160", "path": ["**/details_harness|winogrande|5_2023-09-17T05-11-56.274160.parquet"]}, {"split": "2023_09_18T06_20_53.119467", "path": ["**/details_harness|winogrande|5_2023-09-18T06-20-53.119467.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T06-20-53.119467.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_27T10_54_54.901743", "path": ["results_2023-07-27T10:54:54.901743.parquet"]}, {"split": "2023_07_27T16_15_02.960730", "path": ["results_2023-07-27T16:15:02.960730.parquet"]}, {"split": "2023_09_17T05_11_56.274160", "path": ["results_2023-09-17T05-11-56.274160.parquet"]}, {"split": "2023_09_18T06_20_53.119467", "path": ["results_2023-09-18T06-20-53.119467.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T06-20-53.119467.parquet"]}]}]}
2023-09-18T05:21:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kfkas/Llama-2-ko-7b-Chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model kfkas/Llama-2-ko-7b-Chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-18T06:20:53.119467 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
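The flattened copy above keeps the sentence "To load the details from a run, you can for instance do the following:" but drops the accompanying snippet. A minimal sketch of that call, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention; the config and split names below are the ones declared in this record's metadata block:

```python
from datasets import load_dataset

# Assumed repository id, inferred from the details_<org>__<model> naming convention;
# "harness_winogrande_5" and its "latest"/timestamped splits are declared in this
# record's metadata block.
data = load_dataset(
    "open-llm-leaderboard/details_kfkas__Llama-2-ko-7b-Chat",
    "harness_winogrande_5",
    split="latest",  # or e.g. "2023_09_18T06_20_53.119467"
)
```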
[ "# Dataset Card for Evaluation run of kfkas/Llama-2-ko-7b-Chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model kfkas/Llama-2-ko-7b-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T06:20:53.119467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kfkas/Llama-2-ko-7b-Chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model kfkas/Llama-2-ko-7b-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T06:20:53.119467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kfkas/Llama-2-ko-7b-Chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kfkas/Llama-2-ko-7b-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T06:20:53.119467(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f4601e715c69fa19bbd94c5272eb659afb9ed578
# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Role-Playing-Data ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/camel-ai/CAMEL-13B-Role-Playing-Data - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [camel-ai/CAMEL-13B-Role-Playing-Data](https://huggingface.co/camel-ai/CAMEL-13B-Role-Playing-Data) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_camel-ai__CAMEL-13B-Role-Playing-Data", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-25T02:33:54.730423](https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-13B-Role-Playing-Data/blob/main/results_2023-10-25T02-33-54.730423.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.004404362416107382, "em_stderr": 0.000678145162047963, "f1": 0.06661703020134248, "f1_stderr": 0.001491591221438747, "acc": 0.4069360263718957, "acc_stderr": 0.009756268229958965 }, "harness|drop|3": { "em": 0.004404362416107382, "em_stderr": 0.000678145162047963, "f1": 0.06661703020134248, "f1_stderr": 0.001491591221438747 }, "harness|gsm8k|5": { "acc": 0.07354056103108415, "acc_stderr": 0.007189835754365264 }, "harness|winogrande|5": { "acc": 0.7403314917127072, "acc_stderr": 0.012322700705552667 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_camel-ai__CAMEL-13B-Role-Playing-Data
[ "region:us" ]
2023-08-17T23:02:32+00:00
{"pretty_name": "Evaluation run of camel-ai/CAMEL-13B-Role-Playing-Data", "dataset_summary": "Dataset automatically created during the evaluation run of model [camel-ai/CAMEL-13B-Role-Playing-Data](https://huggingface.co/camel-ai/CAMEL-13B-Role-Playing-Data) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_camel-ai__CAMEL-13B-Role-Playing-Data\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T02:33:54.730423](https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-13B-Role-Playing-Data/blob/main/results_2023-10-25T02-33-54.730423.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.000678145162047963,\n \"f1\": 0.06661703020134248,\n \"f1_stderr\": 0.001491591221438747,\n \"acc\": 0.4069360263718957,\n \"acc_stderr\": 0.009756268229958965\n },\n \"harness|drop|3\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.000678145162047963,\n \"f1\": 0.06661703020134248,\n \"f1_stderr\": 0.001491591221438747\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07354056103108415,\n \"acc_stderr\": 0.007189835754365264\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n }\n}\n```", "repo_url": "https://huggingface.co/camel-ai/CAMEL-13B-Role-Playing-Data", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T02_33_54.730423", "path": ["**/details_harness|drop|3_2023-10-25T02-33-54.730423.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T02-33-54.730423.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T02_33_54.730423", "path": ["**/details_harness|gsm8k|5_2023-10-25T02-33-54.730423.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T02-33-54.730423.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:40:55.376784.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:40:55.376784.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:40:55.376784.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:40:55.376784.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:40:55.376784.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:40:55.376784.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T02_33_54.730423", "path": ["**/details_harness|winogrande|5_2023-10-25T02-33-54.730423.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T02-33-54.730423.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_40_55.376784", "path": ["results_2023-07-19T18:40:55.376784.parquet"]}, {"split": "2023_10_25T02_33_54.730423", "path": ["results_2023-10-25T02-33-54.730423.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T02-33-54.730423.parquet"]}]}]}
2023-10-25T01:34:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Role-Playing-Data ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Role-Playing-Data on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-25T02:33:54.730423 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
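The flattened copy above likewise drops the loading snippet. A minimal sketch using the repository id and config name given in the full card for this record, with "latest" being one of the splits declared in its metadata:

```python
from datasets import load_dataset

# Repository id and config name taken from the full card for this record;
# "latest" (or the timestamped split "2023_10_25T02_33_54.730423") is declared
# in the record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_camel-ai__CAMEL-13B-Role-Playing-Data",
    "harness_winogrande_5",
    split="latest",
)
```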
[ "# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Role-Playing-Data", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Role-Playing-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T02:33:54.730423(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Role-Playing-Data", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Role-Playing-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-25T02:33:54.730423(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Role-Playing-Data## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Role-Playing-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T02:33:54.730423(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0484e9573ebfedd80ba7452a72e659b975d55c8d
# Dataset Card for Evaluation run of camel-ai/CAMEL-33B-Combined-Data ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/camel-ai/CAMEL-33B-Combined-Data - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [camel-ai/CAMEL-33B-Combined-Data](https://huggingface.co/camel-ai/CAMEL-33B-Combined-Data) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_camel-ai__CAMEL-33B-Combined-Data", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-17T14:06:04.717229](https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-33B-Combined-Data/blob/main/results_2023-09-17T14-06-04.717229.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.004404362416107382, "em_stderr": 0.0006781451620479537, "f1": 0.07118393456375847, "f1_stderr": 0.001525704115056517, "acc": 0.4619838879637237, "acc_stderr": 0.010586283529726756 }, "harness|drop|3": { "em": 0.004404362416107382, "em_stderr": 0.0006781451620479537, "f1": 0.07118393456375847, "f1_stderr": 0.001525704115056517 }, "harness|gsm8k|5": { "acc": 0.14101592115238817, "acc_stderr": 0.009586695349244102 }, "harness|winogrande|5": { "acc": 0.7829518547750592, "acc_stderr": 0.01158587171020941 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_camel-ai__CAMEL-33B-Combined-Data
[ "region:us" ]
2023-08-17T23:02:41+00:00
{"pretty_name": "Evaluation run of camel-ai/CAMEL-33B-Combined-Data", "dataset_summary": "Dataset automatically created during the evaluation run of model [camel-ai/CAMEL-33B-Combined-Data](https://huggingface.co/camel-ai/CAMEL-33B-Combined-Data) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_camel-ai__CAMEL-33B-Combined-Data\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T14:06:04.717229](https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-33B-Combined-Data/blob/main/results_2023-09-17T14-06-04.717229.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479537,\n \"f1\": 0.07118393456375847,\n \"f1_stderr\": 0.001525704115056517,\n \"acc\": 0.4619838879637237,\n \"acc_stderr\": 0.010586283529726756\n },\n \"harness|drop|3\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479537,\n \"f1\": 0.07118393456375847,\n \"f1_stderr\": 0.001525704115056517\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14101592115238817,\n \"acc_stderr\": 0.009586695349244102\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.01158587171020941\n }\n}\n```", "repo_url": "https://huggingface.co/camel-ai/CAMEL-33B-Combined-Data", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T14_06_04.717229", "path": ["**/details_harness|drop|3_2023-09-17T14-06-04.717229.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T14-06-04.717229.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T14_06_04.717229", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-06-04.717229.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-06-04.717229.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hellaswag|10_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:41:43.051311.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:41:43.051311.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:41:43.051311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:41:43.051311.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:41:43.051311.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T14_06_04.717229", "path": ["**/details_harness|winogrande|5_2023-09-17T14-06-04.717229.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T14-06-04.717229.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T13_41_43.051311", "path": ["results_2023-08-01T13:41:43.051311.parquet"]}, {"split": "2023_09_17T14_06_04.717229", "path": ["results_2023-09-17T14-06-04.717229.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T14-06-04.717229.parquet"]}]}]}
2023-09-17T13:06:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of camel-ai/CAMEL-33B-Combined-Data ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model camel-ai/CAMEL-33B-Combined-Data on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T14:06:04.717229 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of camel-ai/CAMEL-33B-Combined-Data", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-33B-Combined-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T14:06:04.717229(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of camel-ai/CAMEL-33B-Combined-Data", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-33B-Combined-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T14:06:04.717229(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of camel-ai/CAMEL-33B-Combined-Data## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-33B-Combined-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T14:06:04.717229(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
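The record above explains the layout of these evaluation-detail repositories: one configuration per harness task, one split per run timestamp, and a "latest" split that tracks the most recent run. As a minimal sketch of that loading pattern (the config names `harness_gsm8k_5` and `results`, and the `latest` split, are taken from this record's metadata; availability on the Hub is assumed), the per-task details and the aggregated metrics for this model could be pulled like so:

```python
from datasets import load_dataset

# Details repository named in the CAMEL-33B-Combined-Data record above.
REPO = "open-llm-leaderboard/details_camel-ai__CAMEL-33B-Combined-Data"

# Per-example details for the 5-shot GSM8K run; "latest" resolves to the
# parquet files of the most recent run, per the data_files mapping above.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")

# Aggregated metrics for the same run live in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")

print(gsm8k_details)
print(results[0])  # row layout of the results parquet is assumed; inspect before relying on it
```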
6c63c8cc49774d7468b821bbf7f1222d1b85a9b0
# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Combined-Data

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/camel-ai/CAMEL-13B-Combined-Data
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [camel-ai/CAMEL-13B-Combined-Data](https://huggingface.co/camel-ai/CAMEL-13B-Combined-Data) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T12:27:31.812773](https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data/blob/main/results_2023-09-23T12-27-31.812773.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.01604446308724832,
        "em_stderr": 0.0012867375725646064,
        "f1": 0.07856963087248349,
        "f1_stderr": 0.0018370090964164025,
        "acc": 0.4129021950450372,
        "acc_stderr": 0.009590867532569065
    },
    "harness|drop|3": {
        "em": 0.01604446308724832,
        "em_stderr": 0.0012867375725646064,
        "f1": 0.07856963087248349,
        "f1_stderr": 0.0018370090964164025
    },
    "harness|gsm8k|5": {
        "acc": 0.0712661106899166,
        "acc_stderr": 0.0070864621279544925
    },
    "harness|winogrande|5": {
        "acc": 0.7545382794001578,
        "acc_stderr": 0.012095272937183639
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
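The snippet in the card above always reads the "train" split. Since this dataset was created from two runs, each configuration also carries timestamped splits next to the "latest" alias; the split name used below is copied from this record's metadata for the `harness_drop_3` configuration and is only an illustration of how a specific run can be pinned (other configurations may use different timestamps):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data"

# Pin the exact run via its timestamped split name (from this record's metadata) ...
drop_run = load_dataset(REPO, "harness_drop_3", split="2023_09_23T12_27_31.812773")

# ... or follow whatever the most recent run is.
drop_latest = load_dataset(REPO, "harness_drop_3", split="latest")

print(drop_run)
print(drop_latest)
```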
open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data
[ "region:us" ]
2023-08-17T23:02:49+00:00
{"pretty_name": "Evaluation run of camel-ai/CAMEL-13B-Combined-Data", "dataset_summary": "Dataset automatically created during the evaluation run of model [camel-ai/CAMEL-13B-Combined-Data](https://huggingface.co/camel-ai/CAMEL-13B-Combined-Data) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T12:27:31.812773](https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data/blob/main/results_2023-09-23T12-27-31.812773.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01604446308724832,\n \"em_stderr\": 0.0012867375725646064,\n \"f1\": 0.07856963087248349,\n \"f1_stderr\": 0.0018370090964164025,\n \"acc\": 0.4129021950450372,\n \"acc_stderr\": 0.009590867532569065\n },\n \"harness|drop|3\": {\n \"em\": 0.01604446308724832,\n \"em_stderr\": 0.0012867375725646064,\n \"f1\": 0.07856963087248349,\n \"f1_stderr\": 0.0018370090964164025\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.0070864621279544925\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183639\n }\n}\n```", "repo_url": "https://huggingface.co/camel-ai/CAMEL-13B-Combined-Data", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T12_27_31.812773", "path": ["**/details_harness|drop|3_2023-09-23T12-27-31.812773.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T12-27-31.812773.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T12_27_31.812773", "path": ["**/details_harness|gsm8k|5_2023-09-23T12-27-31.812773.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T12-27-31.812773.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:34:56.119658.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:34:56.119658.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:34:56.119658.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:34:56.119658.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:34:56.119658.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T12_27_31.812773", "path": ["**/details_harness|winogrande|5_2023-09-23T12-27-31.812773.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T12-27-31.812773.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_34_56.119658", "path": ["results_2023-07-19T18:34:56.119658.parquet"]}, {"split": "2023_09_23T12_27_31.812773", "path": ["results_2023-09-23T12-27-31.812773.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T12-27-31.812773.parquet"]}]}]}
2023-09-23T11:27:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Combined-Data ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Combined-Data on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T12:27:31.812773 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
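A minimal sketch of the load call referenced just above ("To load the details from a run, you can for instance do the following:"). The repository id below is an assumption based on the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming and is not stated in this record; the configuration name is taken from the metadata listing above.

```python
from datasets import load_dataset

# Repo id is assumed from the leaderboard naming convention, not taken from this record.
# "harness_winogrande_5" is one of the configurations listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data",
    "harness_winogrande_5",
    split="train",
)
```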
[ "# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Combined-Data", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Combined-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T12:27:31.812773(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Combined-Data", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Combined-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T12:27:31.812773(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of camel-ai/CAMEL-13B-Combined-Data## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model camel-ai/CAMEL-13B-Combined-Data on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T12:27:31.812773(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cbb85670e94040c54136a1fd3847a292acb1b0ae
# Dataset Card for Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b](https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T20:03:30.331669](https://huggingface.co/datasets/open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b/blob/main/results_2023-09-22T20-03-30.331669.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0019924496644295304,
        "em_stderr": 0.000456667646266702,
        "f1": 0.05642302852349,
        "f1_stderr": 0.0012977737732540458,
        "acc": 0.41872834230806744,
        "acc_stderr": 0.009633077195432445
    },
    "harness|drop|3": {
        "em": 0.0019924496644295304,
        "em_stderr": 0.000456667646266702,
        "f1": 0.05642302852349,
        "f1_stderr": 0.0012977737732540458
    },
    "harness|gsm8k|5": {
        "acc": 0.0758150113722517,
        "acc_stderr": 0.0072912057231625796
    },
    "harness|winogrande|5": {
        "acc": 0.7616416732438832,
        "acc_stderr": 0.011974948667702311
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b
[ "region:us" ]
2023-08-17T23:02:58+00:00
{"pretty_name": "Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b](https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T20:03:30.331669](https://huggingface.co/datasets/open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b/blob/main/results_2023-09-22T20-03-30.331669.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.000456667646266702,\n \"f1\": 0.05642302852349,\n \"f1_stderr\": 0.0012977737732540458,\n \"acc\": 0.41872834230806744,\n \"acc_stderr\": 0.009633077195432445\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.000456667646266702,\n \"f1\": 0.05642302852349,\n \"f1_stderr\": 0.0012977737732540458\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \"acc_stderr\": 0.0072912057231625796\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702311\n }\n}\n```", "repo_url": "https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T20_03_30.331669", "path": ["**/details_harness|drop|3_2023-09-22T20-03-30.331669.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T20-03-30.331669.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T20_03_30.331669", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-03-30.331669.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-03-30.331669.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:57:53.688517.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:57:53.688517.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T20_03_30.331669", "path": ["**/details_harness|winogrande|5_2023-09-22T20-03-30.331669.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T20-03-30.331669.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T17_57_53.688517", "path": ["results_2023-08-09T17:57:53.688517.parquet"]}, {"split": "2023_09_22T20_03_30.331669", "path": ["results_2023-09-22T20-03-30.331669.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T20-03-30.331669.parquet"]}]}]}
2023-09-22T19:03:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch after this card text): ## Latest results These are the latest results from run 2023-09-22T20:03:30.331669 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
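The card text above refers to a loading example that was stripped out of this processed field. A minimal sketch of that call, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (the exact repo id is not shown in this excerpt, so treat it as an assumption):

```python
from datasets import load_dataset

# Repo id inferred from the details_<org>__<model> convention -- an assumption, not confirmed here
data = load_dataset(
    "open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b",
    "harness_winogrande_5",  # one of the per-task configurations
    split="train",           # "train" always points to the latest run
)
```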
[ "# Dataset Card for Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T20:03:30.331669(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T20:03:30.331669(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 32, 31, 180, 68, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T20:03:30.331669(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f93246ceaebb9d4d915e8d0311a540bd1fa2b60f
# Dataset Card for Evaluation run of jordiclive/Llama-2-70b-oasst-1-200

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/jordiclive/Llama-2-70b-oasst-1-200
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [jordiclive/Llama-2-70b-oasst-1-200](https://huggingface.co/jordiclive/Llama-2-70b-oasst-1-200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jordiclive__Llama-2-70b-oasst-1-200",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T01:27:48.135864](https://huggingface.co/datasets/open-llm-leaderboard/details_jordiclive__Llama-2-70b-oasst-1-200/blob/main/results_2023-09-17T01-27-48.135864.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.00041913301788269036,
        "f1": 0.06728817114093961,
        "f1_stderr": 0.0013731781553029802,
        "acc": 0.5844391933091307,
        "acc_stderr": 0.011597519226727348
    },
    "harness|drop|3": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.00041913301788269036,
        "f1": 0.06728817114093961,
        "f1_stderr": 0.0013731781553029802
    },
    "harness|gsm8k|5": {
        "acc": 0.32752084912812734,
        "acc_stderr": 0.012927102210426472
    },
    "harness|winogrande|5": {
        "acc": 0.8413575374901342,
        "acc_stderr": 0.010267936243028223
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_jordiclive__Llama-2-70b-oasst-1-200
[ "region:us" ]
2023-08-17T23:03:07+00:00
{"pretty_name": "Evaluation run of jordiclive/Llama-2-70b-oasst-1-200", "dataset_summary": "Dataset automatically created during the evaluation run of model [jordiclive/Llama-2-70b-oasst-1-200](https://huggingface.co/jordiclive/Llama-2-70b-oasst-1-200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jordiclive__Llama-2-70b-oasst-1-200\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T01:27:48.135864](https://huggingface.co/datasets/open-llm-leaderboard/details_jordiclive__Llama-2-70b-oasst-1-200/blob/main/results_2023-09-17T01-27-48.135864.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788269036,\n \"f1\": 0.06728817114093961,\n \"f1_stderr\": 0.0013731781553029802,\n \"acc\": 0.5844391933091307,\n \"acc_stderr\": 0.011597519226727348\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788269036,\n \"f1\": 0.06728817114093961,\n \"f1_stderr\": 0.0013731781553029802\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32752084912812734,\n \"acc_stderr\": 0.012927102210426472\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028223\n }\n}\n```", "repo_url": "https://huggingface.co/jordiclive/Llama-2-70b-oasst-1-200", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T01_27_48.135864", "path": ["**/details_harness|drop|3_2023-09-17T01-27-48.135864.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T01-27-48.135864.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T01_27_48.135864", "path": ["**/details_harness|gsm8k|5_2023-09-17T01-27-48.135864.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T01-27-48.135864.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:36:08.720698.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:36:08.720698.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:36:08.720698.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:36:08.720698.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:36:08.720698.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:36:08.720698.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T01_27_48.135864", "path": ["**/details_harness|winogrande|5_2023-09-17T01-27-48.135864.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T01-27-48.135864.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T00_36_08.720698", "path": ["results_2023-08-10T00:36:08.720698.parquet"]}, {"split": "2023_09_17T01_27_48.135864", "path": ["results_2023-09-17T01-27-48.135864.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T01-27-48.135864.parquet"]}]}]}
2023-09-17T00:28:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jordiclive/Llama-2-70b-oasst-1-200 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jordiclive/Llama-2-70b-oasst-1-200 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch after this card text): ## Latest results These are the latest results from run 2023-09-17T01:27:48.135864 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
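The card text above references a loading example that was stripped from this processed field; the winogrande snippet itself appears in the full card and metadata earlier in this row. As a complementary sketch, this is how one of the per-task configurations enumerated in that metadata can be loaded, with the repo id, config name, and split name taken verbatim from the metadata above:

```python
from datasets import load_dataset

# Per-task details for one MMLU subject; the "latest" split tracks the newest run
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_jordiclive__Llama-2-70b-oasst-1-200",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```

The same pattern applies to any other config listed in the metadata, e.g. `harness_arc_challenge_25` or the aggregated `results` config.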
[ "# Dataset Card for Evaluation run of jordiclive/Llama-2-70b-oasst-1-200", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jordiclive/Llama-2-70b-oasst-1-200 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T01:27:48.135864(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jordiclive/Llama-2-70b-oasst-1-200", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jordiclive/Llama-2-70b-oasst-1-200 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T01:27:48.135864(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jordiclive/Llama-2-70b-oasst-1-200## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jordiclive/Llama-2-70b-oasst-1-200 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T01:27:48.135864(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6337a87b02c8e799602b5cff30da7a5d3d695d7c
# Dataset Card for Evaluation run of jlevin/guanaco-unchained-llama-2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/jlevin/guanaco-unchained-llama-2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [jlevin/guanaco-unchained-llama-2-7b](https://huggingface.co/jlevin/guanaco-unchained-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jlevin__guanaco-unchained-llama-2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T03:51:33.627576](https://huggingface.co/datasets/open-llm-leaderboard/details_jlevin__guanaco-unchained-llama-2-7b/blob/main/results_2023-09-23T03-51-33.627576.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.00388003355704698,
        "em_stderr": 0.0006366682825519847,
        "f1": 0.060006291946308764,
        "f1_stderr": 0.0014772658686472916,
        "acc": 0.3394735314656232,
        "acc_stderr": 0.009225130040171274
    },
    "harness|drop|3": {
        "em": 0.00388003355704698,
        "em_stderr": 0.0006366682825519847,
        "f1": 0.060006291946308764,
        "f1_stderr": 0.0014772658686472916
    },
    "harness|gsm8k|5": {
        "acc": 0.03411675511751327,
        "acc_stderr": 0.00500021260077329
    },
    "harness|winogrande|5": {
        "acc": 0.6448303078137332,
        "acc_stderr": 0.013450047479569257
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
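Beyond the per-task configurations, the card points to a "results" configuration that stores the aggregated metrics for every run. A minimal sketch of loading it, assuming this repo follows the same split layout as the other details repositories above (a timestamped split per run plus "latest"):

```python
from datasets import load_dataset

# Aggregated metrics; "latest" tracks the most recent of the repo's runs
results = load_dataset(
    "open-llm-leaderboard/details_jlevin__guanaco-unchained-llama-2-7b",
    "results",
    split="latest",
)
print(results[0])  # inspect the first row of aggregated scores
```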
open-llm-leaderboard/details_jlevin__guanaco-unchained-llama-2-7b
[ "region:us" ]
2023-08-17T23:03:16+00:00
{"pretty_name": "Evaluation run of jlevin/guanaco-unchained-llama-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jlevin/guanaco-unchained-llama-2-7b](https://huggingface.co/jlevin/guanaco-unchained-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jlevin__guanaco-unchained-llama-2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T03:51:33.627576](https://huggingface.co/datasets/open-llm-leaderboard/details_jlevin__guanaco-unchained-llama-2-7b/blob/main/results_2023-09-23T03-51-33.627576.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00388003355704698,\n \"em_stderr\": 0.0006366682825519847,\n \"f1\": 0.060006291946308764,\n \"f1_stderr\": 0.0014772658686472916,\n \"acc\": 0.3394735314656232,\n \"acc_stderr\": 0.009225130040171274\n },\n \"harness|drop|3\": {\n \"em\": 0.00388003355704698,\n \"em_stderr\": 0.0006366682825519847,\n \"f1\": 0.060006291946308764,\n \"f1_stderr\": 0.0014772658686472916\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \"acc_stderr\": 0.00500021260077329\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6448303078137332,\n \"acc_stderr\": 0.013450047479569257\n }\n}\n```", "repo_url": "https://huggingface.co/jlevin/guanaco-unchained-llama-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|arc:challenge|25_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T18_52_14.422934", "path": ["**/details_harness|drop|3_2023-09-17T18-52-14.422934.parquet"]}, {"split": "2023_09_23T03_51_33.627576", "path": ["**/details_harness|drop|3_2023-09-23T03-51-33.627576.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T03-51-33.627576.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T18_52_14.422934", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-52-14.422934.parquet"]}, {"split": "2023_09_23T03_51_33.627576", "path": ["**/details_harness|gsm8k|5_2023-09-23T03-51-33.627576.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|gsm8k|5_2023-09-23T03-51-33.627576.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hellaswag|10_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:49:06.060712.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T16:49:06.060712.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:41:35.699742.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:41:35.699742.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:41:35.699742.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:41:35.699742.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": 
"2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": 
"2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:41:35.699742.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:41:35.699742.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T18_52_14.422934", "path": ["**/details_harness|winogrande|5_2023-09-17T18-52-14.422934.parquet"]}, {"split": "2023_09_23T03_51_33.627576", "path": ["**/details_harness|winogrande|5_2023-09-23T03-51-33.627576.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T03-51-33.627576.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T16_49_06.060712", "path": ["results_2023-08-09T16:49:06.060712.parquet"]}, {"split": "2023_08_09T20_41_35.699742", "path": ["results_2023-08-09T20:41:35.699742.parquet"]}, {"split": "2023_09_17T18_52_14.422934", "path": ["results_2023-09-17T18-52-14.422934.parquet"]}, {"split": "2023_09_23T03_51_33.627576", "path": ["results_2023-09-23T03-51-33.627576.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T03-51-33.627576.parquet"]}]}]}
2023-09-23T02:51:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jlevin/guanaco-unchained-llama-2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jlevin/guanaco-unchained-llama-2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T03:51:33.627576 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of jlevin/guanaco-unchained-llama-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jlevin/guanaco-unchained-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T03:51:33.627576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jlevin/guanaco-unchained-llama-2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jlevin/guanaco-unchained-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T03:51:33.627576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jlevin/guanaco-unchained-llama-2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jlevin/guanaco-unchained-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T03:51:33.627576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
fbe18184e0009405c79c8bdb967df9bdd130d0e0
# Dataset Card for Evaluation run of jlevin/guanaco-13b-llama-2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/jlevin/guanaco-13b-llama-2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [jlevin/guanaco-13b-llama-2](https://huggingface.co/jlevin/guanaco-13b-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jlevin__guanaco-13b-llama-2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-22T21:26:51.366980](https://huggingface.co/datasets/open-llm-leaderboard/details_jlevin__guanaco-13b-llama-2/blob/main/results_2023-09-22T21-26-51.366980.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.002202181208053691, "em_stderr": 0.00048005108166193844, "f1": 0.05864932885906048, "f1_stderr": 0.0013800780436743112, "acc": 0.4005539821430815, "acc_stderr": 0.009346120664291423 }, "harness|drop|3": { "em": 0.002202181208053691, "em_stderr": 0.00048005108166193844, "f1": 0.05864932885906048, "f1_stderr": 0.0013800780436743112 }, "harness|gsm8k|5": { "acc": 0.0576194086429113, "acc_stderr": 0.006418593319822861 }, "harness|winogrande|5": { "acc": 0.7434885556432518, "acc_stderr": 0.012273648008759986 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
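Since each run is stored as a timestamp-named split, a short sketch like the one below (assuming the `harness_drop_3` configuration and the run timestamp listed in this card's metadata) loads one specific run alongside the `latest` split:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_jlevin__guanaco-13b-llama-2"

# A specific run, addressed by its timestamp-named split...
run = load_dataset(repo, "harness_drop_3", split="2023_09_22T21_26_51.366980")

# ...and the "latest" split, which always points at the newest run.
latest = load_dataset(repo, "harness_drop_3", split="latest")

print(len(run), len(latest))  # number of per-example detail rows in each
```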
open-llm-leaderboard/details_jlevin__guanaco-13b-llama-2
[ "region:us" ]
2023-08-17T23:03:35+00:00
{"pretty_name": "Evaluation run of jlevin/guanaco-13b-llama-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jlevin/guanaco-13b-llama-2](https://huggingface.co/jlevin/guanaco-13b-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jlevin__guanaco-13b-llama-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T21:26:51.366980](https://huggingface.co/datasets/open-llm-leaderboard/details_jlevin__guanaco-13b-llama-2/blob/main/results_2023-09-22T21-26-51.366980.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.00048005108166193844,\n \"f1\": 0.05864932885906048,\n \"f1_stderr\": 0.0013800780436743112,\n \"acc\": 0.4005539821430815,\n \"acc_stderr\": 0.009346120664291423\n },\n \"harness|drop|3\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.00048005108166193844,\n \"f1\": 0.05864932885906048,\n \"f1_stderr\": 0.0013800780436743112\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0576194086429113,\n \"acc_stderr\": 0.006418593319822861\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759986\n }\n}\n```", "repo_url": "https://huggingface.co/jlevin/guanaco-13b-llama-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T21_26_51.366980", "path": ["**/details_harness|drop|3_2023-09-22T21-26-51.366980.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T21-26-51.366980.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T21_26_51.366980", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-26-51.366980.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T21-26-51.366980.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:17:23.827759.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:17:23.827759.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:17:23.827759.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:17:23.827759.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:17:23.827759.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T21_26_51.366980", "path": ["**/details_harness|winogrande|5_2023-09-22T21-26-51.366980.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T21-26-51.366980.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T17_17_23.827759", "path": ["results_2023-08-09T17:17:23.827759.parquet"]}, {"split": "2023_09_22T21_26_51.366980", "path": ["results_2023-09-22T21-26-51.366980.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T21-26-51.366980.parquet"]}]}]}
2023-09-22T20:27:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jlevin/guanaco-13b-llama-2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jlevin/guanaco-13b-llama-2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T21:26:51.366980 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of jlevin/guanaco-13b-llama-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jlevin/guanaco-13b-llama-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:26:51.366980(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jlevin/guanaco-13b-llama-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jlevin/guanaco-13b-llama-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T21:26:51.366980(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jlevin/guanaco-13b-llama-2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jlevin/guanaco-13b-llama-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T21:26:51.366980(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6527dad4531080e17e97932bfc9b643bb99e4c4d
# Dataset Card for Evaluation run of Tincando/fiction_story_generator

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Tincando/fiction_story_generator
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Tincando/fiction_story_generator](https://huggingface.co/Tincando/fiction_story_generator) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Tincando__fiction_story_generator",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T08:16:23.951568](https://huggingface.co/datasets/open-llm-leaderboard/details_Tincando__fiction_story_generator/blob/main/results_2023-10-23T08-16-23.951568.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.016778523489932886,
        "em_stderr": 0.001315352636324007,
        "f1": 0.04902579697986584,
        "f1_stderr": 0.0017542824329442046,
        "acc": 0.2505919494869771,
        "acc_stderr": 0.007026223145264506
    },
    "harness|drop|3": {
        "em": 0.016778523489932886,
        "em_stderr": 0.001315352636324007,
        "f1": 0.04902579697986584,
        "f1_stderr": 0.0017542824329442046
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5011838989739542,
        "acc_stderr": 0.014052446290529012
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Tincando__fiction_story_generator
[ "region:us" ]
2023-08-17T23:03:44+00:00
{"pretty_name": "Evaluation run of Tincando/fiction_story_generator", "dataset_summary": "Dataset automatically created during the evaluation run of model [Tincando/fiction_story_generator](https://huggingface.co/Tincando/fiction_story_generator) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Tincando__fiction_story_generator\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T08:16:23.951568](https://huggingface.co/datasets/open-llm-leaderboard/details_Tincando__fiction_story_generator/blob/main/results_2023-10-23T08-16-23.951568.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016778523489932886,\n \"em_stderr\": 0.001315352636324007,\n \"f1\": 0.04902579697986584,\n \"f1_stderr\": 0.0017542824329442046,\n \"acc\": 0.2505919494869771,\n \"acc_stderr\": 0.007026223145264506\n },\n \"harness|drop|3\": {\n \"em\": 0.016778523489932886,\n \"em_stderr\": 0.001315352636324007,\n \"f1\": 0.04902579697986584,\n \"f1_stderr\": 0.0017542824329442046\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529012\n }\n}\n```", "repo_url": "https://huggingface.co/Tincando/fiction_story_generator", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T08_16_23.951568", "path": ["**/details_harness|drop|3_2023-10-23T08-16-23.951568.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T08-16-23.951568.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T08_16_23.951568", "path": ["**/details_harness|gsm8k|5_2023-10-23T08-16-23.951568.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T08-16-23.951568.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:20:01.774519.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:20:01.774519.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T08_16_23.951568", "path": ["**/details_harness|winogrande|5_2023-10-23T08-16-23.951568.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T08-16-23.951568.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_20_01.774519", "path": ["results_2023-07-19T19:20:01.774519.parquet"]}, {"split": "2023_10_23T08_16_23.951568", "path": ["results_2023-10-23T08-16-23.951568.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T08-16-23.951568.parquet"]}]}]}
2023-10-23T07:16:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Tincando/fiction_story_generator ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Tincando/fiction_story_generator on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T08:16:23.951568 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
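The card above ends its "To load the details from a run" instructions where a Python snippet was stripped from this flattened field. A minimal sketch of that call, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming (the actual id was reduced to "URL" above, so the repo string here is hypothetical):

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the "details_<org>__<model>" naming pattern;
# the "harness_winogrande_5" config and the "latest" split are listed in this
# record's metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_Tincando__fiction_story_generator",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```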
[ "# Dataset Card for Evaluation run of Tincando/fiction_story_generator", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Tincando/fiction_story_generator on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T08:16:23.951568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Tincando/fiction_story_generator", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Tincando/fiction_story_generator on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T08:16:23.951568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Tincando/fiction_story_generator## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Tincando/fiction_story_generator on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T08:16:23.951568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a1b93849d7254021c354acea3d5d442e5313143e
# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [pythainlp/wangchanglm-7.5B-sft-en-sharded](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-12T12:19:58.207629](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded/blob/main/results_2023-10-12T12-19-58.207629.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.13527684563758388,
        "em_stderr": 0.003502595047728489,
        "f1": 0.1918613674496648,
        "f1_stderr": 0.003673521698384984,
        "acc": 0.29237637276332257,
        "acc_stderr": 0.007586068039653844
    },
    "harness|drop|3": {
        "em": 0.13527684563758388,
        "em_stderr": 0.003502595047728489,
        "f1": 0.1918613674496648,
        "f1_stderr": 0.003673521698384984
    },
    "harness|gsm8k|5": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.0013121578148674378
    },
    "harness|winogrande|5": {
        "acc": 0.5824782951854776,
        "acc_stderr": 0.013859978264440251
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded
[ "region:us" ]
2023-08-17T23:03:53+00:00
{"pretty_name": "Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded", "dataset_summary": "Dataset automatically created during the evaluation run of model [pythainlp/wangchanglm-7.5B-sft-en-sharded](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T12:19:58.207629](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-en-sharded/blob/main/results_2023-10-12T12-19-58.207629.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13527684563758388,\n \"em_stderr\": 0.003502595047728489,\n \"f1\": 0.1918613674496648,\n \"f1_stderr\": 0.003673521698384984,\n \"acc\": 0.29237637276332257,\n \"acc_stderr\": 0.007586068039653844\n },\n \"harness|drop|3\": {\n \"em\": 0.13527684563758388,\n \"em_stderr\": 0.003502595047728489,\n \"f1\": 0.1918613674496648,\n \"f1_stderr\": 0.003673521698384984\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674378\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5824782951854776,\n \"acc_stderr\": 0.013859978264440251\n }\n}\n```", "repo_url": "https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en-sharded", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T12_19_58.207629", "path": ["**/details_harness|drop|3_2023-10-12T12-19-58.207629.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T12-19-58.207629.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T12_19_58.207629", "path": ["**/details_harness|gsm8k|5_2023-10-12T12-19-58.207629.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T12-19-58.207629.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:39:12.796428.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:39:12.796428.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T12_19_58.207629", "path": ["**/details_harness|winogrande|5_2023-10-12T12-19-58.207629.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T12-19-58.207629.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_39_12.796428", "path": ["results_2023-07-19T15:39:12.796428.parquet"]}, {"split": "2023_10_12T12_19_58.207629", "path": ["results_2023-10-12T12-19-58.207629.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T12-19-58.207629.parquet"]}]}]}
2023-10-12T11:20:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-en-sharded on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-12T12:19:58.207629 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-en-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T12:19:58.207629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-en-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T12:19:58.207629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-en-sharded## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-en-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T12:19:58.207629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d95af9557f4aefcbbd3057efcbf64a1d69501cf3
# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [pythainlp/wangchanglm-7.5B-sft-enth](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-12T15:30:29.311096](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth/blob/main/results_2023-10-12T15-30-29.311096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0975251677852349,
        "em_stderr": 0.0030381943660923163,
        "f1": 0.15908871644295358,
        "f1_stderr": 0.0032751413358900056,
        "acc": 0.2923141410254953,
        "acc_stderr": 0.007937916046478193
    },
    "harness|drop|3": {
        "em": 0.0975251677852349,
        "em_stderr": 0.0030381943660923163,
        "f1": 0.15908871644295358,
        "f1_stderr": 0.0032751413358900056
    },
    "harness|gsm8k|5": {
        "acc": 0.00530705079605762,
        "acc_stderr": 0.0020013057209480626
    },
    "harness|winogrande|5": {
        "acc": 0.579321231254933,
        "acc_stderr": 0.013874526372008323
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
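The aggregated metrics shown under "Latest results" above live in the "results" configuration of this repository. As a minimal sketch of how you might pull them into a table (the exact column layout of the results parquet is not documented in this card, so the code only inspects whatever columns are present):

```python
from datasets import load_dataset

# Aggregated metrics for the latest run of pythainlp/wangchanglm-7.5B-sft-enth.
results = load_dataset(
    "open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth",
    "results",
    split="latest",
)

# The schema of the results parquet is not spelled out in the card,
# so convert to a DataFrame and look at what is actually there.
df = results.to_pandas()
print(df.columns.tolist())
print(df.head())
```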
open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth
[ "region:us" ]
2023-08-17T23:04:02+00:00
{"pretty_name": "Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth", "dataset_summary": "Dataset automatically created during the evaluation run of model [pythainlp/wangchanglm-7.5B-sft-enth](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T15:30:29.311096](https://huggingface.co/datasets/open-llm-leaderboard/details_pythainlp__wangchanglm-7.5B-sft-enth/blob/main/results_2023-10-12T15-30-29.311096.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0975251677852349,\n \"em_stderr\": 0.0030381943660923163,\n \"f1\": 0.15908871644295358,\n \"f1_stderr\": 0.0032751413358900056,\n \"acc\": 0.2923141410254953,\n \"acc_stderr\": 0.007937916046478193\n },\n \"harness|drop|3\": {\n \"em\": 0.0975251677852349,\n \"em_stderr\": 0.0030381943660923163,\n \"f1\": 0.15908871644295358,\n \"f1_stderr\": 0.0032751413358900056\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.0020013057209480626\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.579321231254933,\n \"acc_stderr\": 0.013874526372008323\n }\n}\n```", "repo_url": "https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T15_30_29.311096", "path": ["**/details_harness|drop|3_2023-10-12T15-30-29.311096.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T15-30-29.311096.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T15_30_29.311096", "path": ["**/details_harness|gsm8k|5_2023-10-12T15-30-29.311096.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T15-30-29.311096.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:30:03.574829.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:30:03.574829.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:30:03.574829.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T15_30_29.311096", "path": ["**/details_harness|winogrande|5_2023-10-12T15-30-29.311096.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T15-30-29.311096.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_30_03.574829", "path": ["results_2023-07-18T11:30:03.574829.parquet"]}, {"split": "2023_10_12T15_30_29.311096", "path": ["results_2023-10-12T15-30-29.311096.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T15-30-29.311096.parquet"]}]}]}
2023-10-12T14:30:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-enth on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-12T15:30:29.311096 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-enth on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T15:30:29.311096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-enth on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T15:30:29.311096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of pythainlp/wangchanglm-7.5B-sft-enth## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model pythainlp/wangchanglm-7.5B-sft-enth on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T15:30:29.311096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9c8aba40b22cdaa067c073f454dcee7a66494a85
# Dataset Card for Evaluation run of lgaalves/gpt2-dolly

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/lgaalves/gpt2-dolly
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [lgaalves/gpt2-dolly](https://huggingface.co/lgaalves/gpt2-dolly) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2-dolly",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-26T15:16:18.909977](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2-dolly/blob/main/results_2023-10-26T15-16-18.909977.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.00029649629898012396,
        "f1": 0.034500838926174546,
        "f1_stderr": 0.0010901499685640162,
        "acc": 0.25805886045310694,
        "acc_stderr": 0.007559135865912546
    },
    "harness|drop|3": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.00029649629898012396,
        "f1": 0.034500838926174546,
        "f1_stderr": 0.0010901499685640162
    },
    "harness|gsm8k|5": {
        "acc": 0.001516300227445034,
        "acc_stderr": 0.0010717793485492627
    },
    "harness|winogrande|5": {
        "acc": 0.5146014206787688,
        "acc_stderr": 0.01404649238327583
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
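Beyond the per-task example above, the aggregated metrics are stored in the "results" configuration, whose "latest" split points at the most recent run listed in this card. A minimal sketch of loading them, assuming the same `datasets` API as the snippet above:

```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split points to the most recent run
# (2023-10-26T15:16:18.909977 in this card).
results = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2-dolly",
                       "results",
                       split="latest")
print(results.column_names)  # inspect which aggregated fields are available
```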
open-llm-leaderboard/details_lgaalves__gpt2-dolly
[ "region:us" ]
2023-08-17T23:04:11+00:00
{"pretty_name": "Evaluation run of lgaalves/gpt2-dolly", "dataset_summary": "Dataset automatically created during the evaluation run of model [lgaalves/gpt2-dolly](https://huggingface.co/lgaalves/gpt2-dolly) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2-dolly\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-26T15:16:18.909977](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2-dolly/blob/main/results_2023-10-26T15-16-18.909977.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012396,\n \"f1\": 0.034500838926174546,\n \"f1_stderr\": 0.0010901499685640162,\n \"acc\": 0.25805886045310694,\n \"acc_stderr\": 0.007559135865912546\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012396,\n \"f1\": 0.034500838926174546,\n \"f1_stderr\": 0.0010901499685640162\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492627\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5146014206787688,\n \"acc_stderr\": 0.01404649238327583\n }\n}\n```", "repo_url": "https://huggingface.co/lgaalves/gpt2-dolly", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|arc:challenge|25_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|arc:challenge|25_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T15_21_03.106621", "path": ["**/details_harness|drop|3_2023-10-16T15-21-03.106621.parquet"]}, {"split": "2023_10_26T15_16_18.909977", "path": ["**/details_harness|drop|3_2023-10-26T15-16-18.909977.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-26T15-16-18.909977.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T15_21_03.106621", "path": ["**/details_harness|gsm8k|5_2023-10-16T15-21-03.106621.parquet"]}, {"split": "2023_10_26T15_16_18.909977", "path": ["**/details_harness|gsm8k|5_2023-10-26T15-16-18.909977.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-26T15-16-18.909977.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hellaswag|10_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hellaswag|10_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:04:01.298115.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T12:04:01.298115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T18-57-43.248355.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T18-57-43.248355.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T18-57-43.248355.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T18-57-43.248355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:04:01.298115.parquet"]}, 
{"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:04:01.298115.parquet"]}, 
{"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:04:01.298115.parquet"]}, 
{"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T18-57-43.248355.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T18-57-43.248355.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T15_21_03.106621", "path": ["**/details_harness|winogrande|5_2023-10-16T15-21-03.106621.parquet"]}, {"split": "2023_10_26T15_16_18.909977", "path": ["**/details_harness|winogrande|5_2023-10-26T15-16-18.909977.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-26T15-16-18.909977.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T12_04_01.298115", "path": ["results_2023-08-09T12:04:01.298115.parquet"]}, {"split": "2023_09_21T18_57_43.248355", "path": ["results_2023-09-21T18-57-43.248355.parquet"]}, {"split": "2023_10_16T15_21_03.106621", "path": ["results_2023-10-16T15-21-03.106621.parquet"]}, {"split": "2023_10_26T15_16_18.909977", "path": ["results_2023-10-26T15-16-18.909977.parquet"]}, {"split": "latest", "path": ["results_2023-10-26T15-16-18.909977.parquet"]}]}]}
2023-10-26T14:16:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of lgaalves/gpt2-dolly ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model lgaalves/gpt2-dolly on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-26T15:16:18.909977 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of lgaalves/gpt2-dolly", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2-dolly on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T15:16:18.909977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of lgaalves/gpt2-dolly", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2-dolly on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T15:16:18.909977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lgaalves/gpt2-dolly## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2-dolly on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-26T15:16:18.909977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
df5b5396d5bc308a6f8875a8127073f625d0814a
# Dataset Card for Evaluation run of Rachneet/gpt2-xl-alpaca ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Rachneet/gpt2-xl-alpaca - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Rachneet/gpt2-xl-alpaca](https://huggingface.co/Rachneet/gpt2-xl-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T05:57:01.634897](https://huggingface.co/datasets/open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca/blob/main/results_2023-10-15T05-57-01.634897.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.00576761744966443, "em_stderr": 0.0007755000442814736, "f1": 0.06548028523489936, "f1_stderr": 0.001565882245526754, "acc": 0.2845303867403315, "acc_stderr": 0.00695889831166798 }, "harness|drop|3": { "em": 0.00576761744966443, "em_stderr": 0.0007755000442814736, "f1": 0.06548028523489936, "f1_stderr": 0.001565882245526754 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.569060773480663, "acc_stderr": 0.01391779662333596 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
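Beyond the per-task example shown in the card above, the aggregated "results" configuration it describes can be loaded the same way. A minimal sketch follows; the "results" config and "latest" split names are taken from this card's config metadata (listed further down in this record), so adjust if the repository layout changes.

```python
# Sketch: load the aggregated "results" configuration for this evaluation run
# and read the most recent aggregated record via the "latest" split.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca",
    "results",        # aggregated results config described in the card
    split="latest",   # always maps to the newest timestamped run
)
# Each row holds one aggregated results record; inspect the first one.
print(results[0])
```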
open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca
[ "region:us" ]
2023-08-17T23:04:20+00:00
{"pretty_name": "Evaluation run of Rachneet/gpt2-xl-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [Rachneet/gpt2-xl-alpaca](https://huggingface.co/Rachneet/gpt2-xl-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T05:57:01.634897](https://huggingface.co/datasets/open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca/blob/main/results_2023-10-15T05-57-01.634897.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00576761744966443,\n \"em_stderr\": 0.0007755000442814736,\n \"f1\": 0.06548028523489936,\n \"f1_stderr\": 0.001565882245526754,\n \"acc\": 0.2845303867403315,\n \"acc_stderr\": 0.00695889831166798\n },\n \"harness|drop|3\": {\n \"em\": 0.00576761744966443,\n \"em_stderr\": 0.0007755000442814736,\n \"f1\": 0.06548028523489936,\n \"f1_stderr\": 0.001565882245526754\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.569060773480663,\n \"acc_stderr\": 0.01391779662333596\n }\n}\n```", "repo_url": "https://huggingface.co/Rachneet/gpt2-xl-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|arc:challenge|25_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T05_57_01.634897", "path": ["**/details_harness|drop|3_2023-10-15T05-57-01.634897.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T05-57-01.634897.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T05_57_01.634897", "path": ["**/details_harness|gsm8k|5_2023-10-15T05-57-01.634897.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T05-57-01.634897.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hellaswag|10_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T18:01:10.182884.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T18:01:10.182884.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T05_57_01.634897", "path": ["**/details_harness|winogrande|5_2023-10-15T05-57-01.634897.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T05-57-01.634897.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T18_01_10.182884", "path": ["results_2023-07-18T18:01:10.182884.parquet"]}, {"split": "2023_10_15T05_57_01.634897", "path": ["results_2023-10-15T05-57-01.634897.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T05-57-01.634897.parquet"]}]}]}
2023-10-15T04:57:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Rachneet/gpt2-xl-alpaca ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Rachneet/gpt2-xl-alpaca on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T05:57:01.634897 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Rachneet/gpt2-xl-alpaca", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rachneet/gpt2-xl-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T05:57:01.634897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Rachneet/gpt2-xl-alpaca", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rachneet/gpt2-xl-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T05:57:01.634897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Rachneet/gpt2-xl-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rachneet/gpt2-xl-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T05:57:01.634897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ab7e5cadacb6c9f2245daa67014b7f5cd5762c6f
# Dataset Card for Evaluation run of databricks/dolly-v2-7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/databricks/dolly-v2-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [databricks/dolly-v2-7b](https://huggingface.co/databricks/dolly-v2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_databricks__dolly-v2-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-15T13:27:34.576106](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-7b/blob/main/results_2023-10-15T13-27-34.576106.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0018875838926174498, "em_stderr": 0.00044451099905589976, "f1": 0.059697986577181554, "f1_stderr": 0.0013648879248414308, "acc": 0.3060018322459733, "acc_stderr": 0.008342799872753168 }, "harness|drop|3": { "em": 0.0018875838926174498, "em_stderr": 0.00044451099905589976, "f1": 0.059697986577181554, "f1_stderr": 0.0013648879248414308 }, "harness|gsm8k|5": { "acc": 0.011372251705837756, "acc_stderr": 0.002920666198788728 }, "harness|winogrande|5": { "acc": 0.6006314127861089, "acc_stderr": 0.013764933546717607 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
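Since the card above mentions 64 per-task configurations, it can help to enumerate them before picking one to load. This is a small sketch using the `datasets` library's `get_dataset_config_names`; the "harness_drop_3" config and its "latest" split are taken from this record's config metadata, and the printed output is only illustrative.

```python
# Sketch: list the available per-task configurations for this evaluation run,
# then load the DROP details from the latest run.
from datasets import get_dataset_config_names, load_dataset

configs = get_dataset_config_names("open-llm-leaderboard/details_databricks__dolly-v2-7b")
print(len(configs), configs[:5])  # e.g. number of configs and a few names

drop_details = load_dataset(
    "open-llm-leaderboard/details_databricks__dolly-v2-7b",
    "harness_drop_3",   # per-task config listed in this record's metadata
    split="latest",     # newest timestamped run for that task
)
print(drop_details)
```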
open-llm-leaderboard/details_databricks__dolly-v2-7b
[ "region:us" ]
2023-08-17T23:04:29+00:00
{"pretty_name": "Evaluation run of databricks/dolly-v2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [databricks/dolly-v2-7b](https://huggingface.co/databricks/dolly-v2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_databricks__dolly-v2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T13:27:34.576106](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-7b/blob/main/results_2023-10-15T13-27-34.576106.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589976,\n \"f1\": 0.059697986577181554,\n \"f1_stderr\": 0.0013648879248414308,\n \"acc\": 0.3060018322459733,\n \"acc_stderr\": 0.008342799872753168\n },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589976,\n \"f1\": 0.059697986577181554,\n \"f1_stderr\": 0.0013648879248414308\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.002920666198788728\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6006314127861089,\n \"acc_stderr\": 0.013764933546717607\n }\n}\n```", "repo_url": "https://huggingface.co/databricks/dolly-v2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T13_27_34.576106", "path": ["**/details_harness|drop|3_2023-10-15T13-27-34.576106.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T13-27-34.576106.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T13_27_34.576106", "path": ["**/details_harness|gsm8k|5_2023-10-15T13-27-34.576106.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T13-27-34.576106.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:46:56.588473.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:46:56.588473.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T13_27_34.576106", "path": ["**/details_harness|winogrande|5_2023-10-15T13-27-34.576106.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T13-27-34.576106.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_46_56.588473", "path": ["results_2023-07-18T11:46:56.588473.parquet"]}, {"split": "2023_10_15T13_27_34.576106", "path": ["results_2023-10-15T13-27-34.576106.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T13-27-34.576106.parquet"]}]}]}
2023-10-15T12:27:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of databricks/dolly-v2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model databricks/dolly-v2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T13:27:34.576106 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
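The loading example referenced just above was stripped when this text was flattened; reconstructed from the `dataset_summary` in this row's metadata (same repository id and config name), a minimal sketch would read:

```python
from datasets import load_dataset

# One configuration per evaluated task; per the card, the "train" split
# always points at the latest run for that task.
data = load_dataset(
    "open-llm-leaderboard/details_databricks__dolly-v2-7b",
    "harness_winogrande_5",
    split="train",
)
```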
[ "# Dataset Card for Evaluation run of databricks/dolly-v2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T13:27:34.576106(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of databricks/dolly-v2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T13:27:34.576106(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of databricks/dolly-v2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T13:27:34.576106(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
ef200f8a9af014243e364dafe1396929fc7a2a28
# Dataset Card for Evaluation run of databricks/dolly-v2-12b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/databricks/dolly-v2-12b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [databricks/dolly-v2-12b](https://huggingface.co/databricks/dolly-v2-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_databricks__dolly-v2-12b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T05:02:42.236847](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-12b/blob/main/results_2023-09-23T05-02-42.236847.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.0004191330178826844,
        "f1": 0.06285968959731549,
        "f1_stderr": 0.0014820300080071475,
        "acc": 0.31032723721601535,
        "acc_stderr": 0.008366390657090902
    },
    "harness|drop|3": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.0004191330178826844,
        "f1": 0.06285968959731549,
        "f1_stderr": 0.0014820300080071475
    },
    "harness|gsm8k|5": {
        "acc": 0.012130401819560273,
        "acc_stderr": 0.0030152942428909495
    },
    "harness|winogrande|5": {
        "acc": 0.6085240726124704,
        "acc_stderr": 0.013717487071290854
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
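The card above shows how to load a single task configuration; the "results" configuration it describes can be read the same way. A minimal sketch follows, using the config and split names listed in the metadata below; the exact column layout of the results parquet is an assumption, so the snippet inspects the schema before relying on any field:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; per the metadata, its
# "latest" split always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_databricks__dolly-v2-12b",
    "results",
    split="latest",
)

# The precise schema of the results parquet is not documented in the card,
# so print the available columns and the first row before indexing into them.
print(results.column_names)
print(results[0])
```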
open-llm-leaderboard/details_databricks__dolly-v2-12b
[ "region:us" ]
2023-08-17T23:04:38+00:00
{"pretty_name": "Evaluation run of databricks/dolly-v2-12b", "dataset_summary": "Dataset automatically created during the evaluation run of model [databricks/dolly-v2-12b](https://huggingface.co/databricks/dolly-v2-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_databricks__dolly-v2-12b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T05:02:42.236847](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-12b/blob/main/results_2023-09-23T05-02-42.236847.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826844,\n \"f1\": 0.06285968959731549,\n \"f1_stderr\": 0.0014820300080071475,\n \"acc\": 0.31032723721601535,\n \"acc_stderr\": 0.008366390657090902\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826844,\n \"f1\": 0.06285968959731549,\n \"f1_stderr\": 0.0014820300080071475\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \"acc_stderr\": 0.0030152942428909495\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6085240726124704,\n \"acc_stderr\": 0.013717487071290854\n }\n}\n```", "repo_url": "https://huggingface.co/databricks/dolly-v2-12b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|arc:challenge|25_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T05_02_42.236847", "path": ["**/details_harness|drop|3_2023-09-23T05-02-42.236847.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T05-02-42.236847.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T05_02_42.236847", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-02-42.236847.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T05-02-42.236847.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hellaswag|10_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T13:43:42.069045.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T13:43:42.069045.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T05_02_42.236847", "path": ["**/details_harness|winogrande|5_2023-09-23T05-02-42.236847.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T05-02-42.236847.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T13_43_42.069045", "path": ["results_2023-07-18T13:43:42.069045.parquet"]}, {"split": "2023_09_23T05_02_42.236847", "path": ["results_2023-09-23T05-02-42.236847.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T05-02-42.236847.parquet"]}]}]}
2023-09-23T04:02:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of databricks/dolly-v2-12b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model databricks/dolly-v2-12b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T05:02:42.236847 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
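The loading snippet referenced in this flattened card text is not included above. A minimal sketch follows; the repository name is an assumption based on the usual open-llm-leaderboard naming scheme (open-llm-leaderboard/details_databricks__dolly-v2-12b), while the harness_winogrande_5 configuration and its timestamped split are taken from the metadata listed above.

```python
from datasets import load_dataset

# Repository name assumed from the open-llm-leaderboard naming convention;
# the record's id field is not shown here, so treat this as hypothetical.
REPO = "open-llm-leaderboard/details_databricks__dolly-v2-12b"

# "train" always points at the latest results, per the card text.
data = load_dataset(REPO, "harness_winogrande_5", split="train")

# A specific run can also be loaded via its timestamped split,
# e.g. the 2023-09-23 run listed in the metadata above.
run_details = load_dataset(REPO, "harness_winogrande_5",
                           split="2023_09_23T05_02_42.236847")
```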
[ "# Dataset Card for Evaluation run of databricks/dolly-v2-12b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-12b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:02:42.236847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of databricks/dolly-v2-12b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-12b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T05:02:42.236847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of databricks/dolly-v2-12b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-12b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T05:02:42.236847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2c139f841dbc37885e93b0c2806abc865b715972
# Dataset Card for Evaluation run of databricks/dolly-v2-3b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/databricks/dolly-v2-3b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [databricks/dolly-v2-3b](https://huggingface.co/databricks/dolly-v2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_databricks__dolly-v2-3b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T08:28:44.127308](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-3b/blob/main/results_2023-10-15T08-28-44.127308.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001572986577181208,
        "em_stderr": 0.00040584511324177506,
        "f1": 0.05171245805369139,
        "f1_stderr": 0.0012518561178042446,
        "acc": 0.30246569325856754,
        "acc_stderr": 0.008311459829200955
    },
    "harness|drop|3": {
        "em": 0.001572986577181208,
        "em_stderr": 0.00040584511324177506,
        "f1": 0.05171245805369139,
        "f1_stderr": 0.0012518561178042446
    },
    "harness|gsm8k|5": {
        "acc": 0.01061410159211524,
        "acc_stderr": 0.002822713322387704
    },
    "harness|winogrande|5": {
        "acc": 0.5943172849250198,
        "acc_stderr": 0.013800206336014207
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_databricks__dolly-v2-3b
[ "region:us" ]
2023-08-17T23:04:47+00:00
{"pretty_name": "Evaluation run of databricks/dolly-v2-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [databricks/dolly-v2-3b](https://huggingface.co/databricks/dolly-v2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_databricks__dolly-v2-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T08:28:44.127308](https://huggingface.co/datasets/open-llm-leaderboard/details_databricks__dolly-v2-3b/blob/main/results_2023-10-15T08-28-44.127308.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177506,\n \"f1\": 0.05171245805369139,\n \"f1_stderr\": 0.0012518561178042446,\n \"acc\": 0.30246569325856754,\n \"acc_stderr\": 0.008311459829200955\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177506,\n \"f1\": 0.05171245805369139,\n \"f1_stderr\": 0.0012518561178042446\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.002822713322387704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5943172849250198,\n \"acc_stderr\": 0.013800206336014207\n }\n}\n```", "repo_url": "https://huggingface.co/databricks/dolly-v2-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T19_06_58.415048", "path": ["**/details_harness|drop|3_2023-10-12T19-06-58.415048.parquet"]}, {"split": "2023_10_15T08_28_44.127308", "path": ["**/details_harness|drop|3_2023-10-15T08-28-44.127308.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T08-28-44.127308.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T19_06_58.415048", "path": ["**/details_harness|gsm8k|5_2023-10-12T19-06-58.415048.parquet"]}, {"split": "2023_10_15T08_28_44.127308", "path": ["**/details_harness|gsm8k|5_2023-10-15T08-28-44.127308.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T08-28-44.127308.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": 
["**/details_harness|hellaswag|10_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:08:45.552470.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:08:45.552470.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:08:45.552470.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-19T15:08:45.552470.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:08:45.552470.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:08:45.552470.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:08:45.552470.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T19_06_58.415048", "path": ["**/details_harness|winogrande|5_2023-10-12T19-06-58.415048.parquet"]}, {"split": "2023_10_15T08_28_44.127308", "path": ["**/details_harness|winogrande|5_2023-10-15T08-28-44.127308.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T08-28-44.127308.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_08_45.552470", "path": ["results_2023-07-19T15:08:45.552470.parquet"]}, {"split": "2023_10_12T19_06_58.415048", "path": ["results_2023-10-12T19-06-58.415048.parquet"]}, {"split": "2023_10_15T08_28_44.127308", "path": ["results_2023-10-15T08-28-44.127308.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T08-28-44.127308.parquet"]}]}]}
2023-10-15T07:28:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of databricks/dolly-v2-3b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model databricks/dolly-v2-3b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T08:28:44.127308 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
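The flattened summary above refers to a loading snippet that was dropped in this cleaned rendering; below is a minimal sketch of what that call would look like. The dataset repository id is an assumption inferred from the `open-llm-leaderboard/details_<org>__<model>` naming pattern of the other entries in this dump (it is not stated in this entry), while the config name comes from the entry's own file list.

```python
from datasets import load_dataset

# NOTE: the repository id below is assumed from the naming pattern of the other
# entries in this dump; it is not stated in the flattened summary above.
data = load_dataset(
    "open-llm-leaderboard/details_databricks__dolly-v2-3b",
    "harness_winogrande_5",  # config listed in this entry's metadata
    split="train",           # "train" always points to the latest results
)
```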
[ "# Dataset Card for Evaluation run of databricks/dolly-v2-3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T08:28:44.127308(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of databricks/dolly-v2-3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T08:28:44.127308(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of databricks/dolly-v2-3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model databricks/dolly-v2-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T08:28:44.127308(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
efaaeec18de43018a18ccd77fdf5f039e04a3adf
# Dataset Card for Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [abhiramtirumala/DialoGPT-sarcastic-medium](https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T00:36:45.634956](https://huggingface.co/datasets/open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium/blob/main/results_2023-09-23T00-36-45.634956.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 0.0,
        "f1_stderr": 0.0,
        "acc": 0.26677190213101815,
        "acc_stderr": 0.007010413338799049
    },
    "harness|drop|3": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 0.0,
        "f1_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5335438042620363,
        "acc_stderr": 0.014020826677598098
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
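Beyond the per-task details, the aggregated scores mentioned above can be loaded the same way; a minimal sketch, using only the "results" configuration and "latest" split names listed in this entry's metadata:

```python
from datasets import load_dataset

# The "results" config keeps one split per evaluation timestamp plus a "latest"
# split that points at the most recent run's aggregated metrics.
results = load_dataset(
    "open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium",
    "results",
    split="latest",
)
print(results[0])  # a single row with the aggregated metrics for the latest run
```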
open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium
[ "region:us" ]
2023-08-17T23:04:56+00:00
{"pretty_name": "Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhiramtirumala/DialoGPT-sarcastic-medium](https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T00:36:45.634956](https://huggingface.co/datasets/open-llm-leaderboard/details_abhiramtirumala__DialoGPT-sarcastic-medium/blob/main/results_2023-09-23T00-36-45.634956.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"acc\": 0.26677190213101815,\n \"acc_stderr\": 0.007010413338799049\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5335438042620363,\n \"acc_stderr\": 0.014020826677598098\n }\n}\n```", "repo_url": "https://huggingface.co/abhiramtirumala/DialoGPT-sarcastic-medium", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T18_48_39.393988", "path": ["**/details_harness|drop|3_2023-09-17T18-48-39.393988.parquet"]}, {"split": "2023_09_23T00_36_45.634956", "path": ["**/details_harness|drop|3_2023-09-23T00-36-45.634956.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T00-36-45.634956.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T18_48_39.393988", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-48-39.393988.parquet"]}, {"split": "2023_09_23T00_36_45.634956", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-36-45.634956.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-36-45.634956.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:39:52.332273.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:39:52.332273.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:39:52.332273.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T18_48_39.393988", "path": ["**/details_harness|winogrande|5_2023-09-17T18-48-39.393988.parquet"]}, {"split": "2023_09_23T00_36_45.634956", "path": ["**/details_harness|winogrande|5_2023-09-23T00-36-45.634956.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T00-36-45.634956.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T10_39_52.332273", "path": ["results_2023-07-19T10:39:52.332273.parquet"]}, {"split": "2023_09_17T18_48_39.393988", "path": ["results_2023-09-17T18-48-39.393988.parquet"]}, {"split": "2023_09_23T00_36_45.634956", "path": ["results_2023-09-23T00-36-45.634956.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T00-36-45.634956.parquet"]}]}]}
2023-09-22T23:36:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model abhiramtirumala/DialoGPT-sarcastic-medium on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T00:36:45.634956 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model abhiramtirumala/DialoGPT-sarcastic-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:36:45.634956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model abhiramtirumala/DialoGPT-sarcastic-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:36:45.634956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abhiramtirumala/DialoGPT-sarcastic-medium## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model abhiramtirumala/DialoGPT-sarcastic-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T00:36:45.634956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
98d4b640c6e687c7e58066d4c868a2dc3046b0ab
# Dataset Card for Evaluation run of lilloukas/Platypus-30B

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/lilloukas/Platypus-30B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [lilloukas/Platypus-30B](https://huggingface.co/lilloukas/Platypus-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lilloukas__Platypus-30B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T05:57:25.138979](https://huggingface.co/datasets/open-llm-leaderboard/details_lilloukas__Platypus-30B/blob/main/results_2023-09-17T05-57-25.138979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.4152684563758389,
        "em_stderr": 0.005046408282247135,
        "f1": 0.4565257969798663,
        "f1_stderr": 0.004890389225361096,
        "acc": 0.4788908748525736,
        "acc_stderr": 0.010306994464370747
    },
    "harness|drop|3": {
        "em": 0.4152684563758389,
        "em_stderr": 0.005046408282247135,
        "f1": 0.4565257969798663,
        "f1_stderr": 0.004890389225361096
    },
    "harness|gsm8k|5": {
        "acc": 0.14404852160727824,
        "acc_stderr": 0.009672110973065282
    },
    "harness|winogrande|5": {
        "acc": 0.813733228097869,
        "acc_stderr": 0.010941877955676211
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
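The same pattern works for any of the per-task configurations enumerated in this entry's metadata; a minimal sketch for the 5-shot GSM8K details (config and split names taken from that file list, not re-verified against the Hub):

```python
from datasets import load_dataset

# Per-example GSM8K details from the most recent evaluation run of Platypus-30B.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_lilloukas__Platypus-30B",
    "harness_gsm8k_5",
    split="latest",
)
print(len(gsm8k_details))  # number of evaluated GSM8K examples
```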
open-llm-leaderboard/details_lilloukas__Platypus-30B
[ "region:us" ]
2023-08-17T23:05:06+00:00
{"pretty_name": "Evaluation run of lilloukas/Platypus-30B", "dataset_summary": "Dataset automatically created during the evaluation run of model [lilloukas/Platypus-30B](https://huggingface.co/lilloukas/Platypus-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lilloukas__Platypus-30B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T05:57:25.138979](https://huggingface.co/datasets/open-llm-leaderboard/details_lilloukas__Platypus-30B/blob/main/results_2023-09-17T05-57-25.138979.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4152684563758389,\n \"em_stderr\": 0.005046408282247135,\n \"f1\": 0.4565257969798663,\n \"f1_stderr\": 0.004890389225361096,\n \"acc\": 0.4788908748525736,\n \"acc_stderr\": 0.010306994464370747\n },\n \"harness|drop|3\": {\n \"em\": 0.4152684563758389,\n \"em_stderr\": 0.005046408282247135,\n \"f1\": 0.4565257969798663,\n \"f1_stderr\": 0.004890389225361096\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14404852160727824,\n \"acc_stderr\": 0.009672110973065282\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676211\n }\n}\n```", "repo_url": "https://huggingface.co/lilloukas/Platypus-30B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T05_57_25.138979", "path": ["**/details_harness|drop|3_2023-09-17T05-57-25.138979.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T05-57-25.138979.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T05_57_25.138979", "path": ["**/details_harness|gsm8k|5_2023-09-17T05-57-25.138979.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T05-57-25.138979.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:45:02.696603.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:45:02.696603.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:45:02.696603.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:45:02.696603.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:45:02.696603.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T05_57_25.138979", "path": ["**/details_harness|winogrande|5_2023-09-17T05-57-25.138979.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T05-57-25.138979.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_45_02.696603", "path": ["results_2023-07-19T22:45:02.696603.parquet"]}, {"split": "2023_09_17T05_57_25.138979", "path": ["results_2023-09-17T05-57-25.138979.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T05-57-25.138979.parquet"]}]}]}
2023-09-17T04:57:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of lilloukas/Platypus-30B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model lilloukas/Platypus-30B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T05:57:25.138979 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
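The flattened summary above drops the loading snippet that normally follows "To load the details from a run". A minimal sketch, using the repository id and the `harness_winogrande_5` config recorded in the metadata above (and assuming the Hugging Face `datasets` library is installed), would be:

```python
from datasets import load_dataset

# Winogrande details for lilloukas/Platypus-30B; the "train" split
# always points to the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_lilloukas__Platypus-30B",
    "harness_winogrande_5",
    split="train",
)
```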
[ "# Dataset Card for Evaluation run of lilloukas/Platypus-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lilloukas/Platypus-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T05:57:25.138979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of lilloukas/Platypus-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lilloukas/Platypus-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T05:57:25.138979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lilloukas/Platypus-30B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lilloukas/Platypus-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T05:57:25.138979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0082b67ebaa339f9c0102fac7c40c043b8337908
# Dataset Card for Evaluation run of lilloukas/GPlatty-30B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/lilloukas/GPlatty-30B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [lilloukas/GPlatty-30B](https://huggingface.co/lilloukas/GPlatty-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lilloukas__GPlatty-30B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T00:38:16.456797](https://huggingface.co/datasets/open-llm-leaderboard/details_lilloukas__GPlatty-30B/blob/main/results_2023-09-23T00-38-16.456797.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.4629823825503356,
        "em_stderr": 0.005106415513013176,
        "f1": 0.5073416526845649,
        "f1_stderr": 0.004906633817362961,
        "acc": 0.4742641844979544,
        "acc_stderr": 0.010275992859707792
    },
    "harness|drop|3": {
        "em": 0.4629823825503356,
        "em_stderr": 0.005106415513013176,
        "f1": 0.5073416526845649,
        "f1_stderr": 0.004906633817362961
    },
    "harness|gsm8k|5": {
        "acc": 0.13874147081122062,
        "acc_stderr": 0.009521649920798146
    },
    "harness|winogrande|5": {
        "acc": 0.8097868981846882,
        "acc_stderr": 0.01103033579861744
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
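As a usage sketch complementing the snippet above (assuming the "results" configuration follows the same split layout shown for the task configs in the metadata, with timestamp-named splits and a "latest" alias), the aggregated metrics can be loaded directly:

```python
from datasets import load_dataset

# Aggregated metrics across all evaluated tasks; "latest" always points
# to the most recent run of lilloukas/GPlatty-30B.
results = load_dataset(
    "open-llm-leaderboard/details_lilloukas__GPlatty-30B",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated results for the latest run
```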
open-llm-leaderboard/details_lilloukas__GPlatty-30B
[ "region:us" ]
2023-08-17T23:05:15+00:00
{"pretty_name": "Evaluation run of lilloukas/GPlatty-30B", "dataset_summary": "Dataset automatically created during the evaluation run of model [lilloukas/GPlatty-30B](https://huggingface.co/lilloukas/GPlatty-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lilloukas__GPlatty-30B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T00:38:16.456797](https://huggingface.co/datasets/open-llm-leaderboard/details_lilloukas__GPlatty-30B/blob/main/results_2023-09-23T00-38-16.456797.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4629823825503356,\n \"em_stderr\": 0.005106415513013176,\n \"f1\": 0.5073416526845649,\n \"f1_stderr\": 0.004906633817362961,\n \"acc\": 0.4742641844979544,\n \"acc_stderr\": 0.010275992859707792\n },\n \"harness|drop|3\": {\n \"em\": 0.4629823825503356,\n \"em_stderr\": 0.005106415513013176,\n \"f1\": 0.5073416526845649,\n \"f1_stderr\": 0.004906633817362961\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13874147081122062,\n \"acc_stderr\": 0.009521649920798146\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n }\n}\n```", "repo_url": "https://huggingface.co/lilloukas/GPlatty-30B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T00_38_16.456797", "path": ["**/details_harness|drop|3_2023-09-23T00-38-16.456797.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T00-38-16.456797.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T00_38_16.456797", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-38-16.456797.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-38-16.456797.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": 
["**/details_harness|hellaswag|10_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:09:17.218494.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:09:17.218494.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:25:28.445280.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:25:28.445280.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:25:28.445280.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:25:28.445280.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": 
"2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": 
"2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": 
"2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:25:28.445280.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:25:28.445280.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T00_38_16.456797", "path": ["**/details_harness|winogrande|5_2023-09-23T00-38-16.456797.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T00-38-16.456797.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_09_17.218494", "path": ["results_2023-07-19T13:09:17.218494.parquet"]}, {"split": "2023_07_19T22_25_28.445280", "path": ["results_2023-07-19T22:25:28.445280.parquet"]}, {"split": "2023_09_23T00_38_16.456797", "path": ["results_2023-09-23T00-38-16.456797.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T00-38-16.456797.parquet"]}]}]}
2023-09-22T23:38:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of lilloukas/GPlatty-30B

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model lilloukas/GPlatty-30B on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the loading sketch after this card):

## Latest results

These are the latest results from run 2023-09-23T00:38:16.456797 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
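The card above says "do the following" but its loading snippet was stripped along with its URLs. A minimal sketch of what that load looks like; the repository id is an assumption following the open-llm-leaderboard/details_<org>__<model> naming pattern used by the other evaluation-run cards, and the config name is one of the task configs listed in the run metadata:

```python
from datasets import load_dataset

# Repo id is assumed from the details_<org>__<model> naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_lilloukas__GPlatty-30B",
    "harness_winogrande_5",  # one config per task, as listed in the run metadata
    split="train",           # per the card, "train" always points to the latest results
)
```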
[ "# Dataset Card for Evaluation run of lilloukas/GPlatty-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lilloukas/GPlatty-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:38:16.456797(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of lilloukas/GPlatty-30B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lilloukas/GPlatty-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T00:38:16.456797(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lilloukas/GPlatty-30B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lilloukas/GPlatty-30B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T00:38:16.456797(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c22e209efe8581665dd39f1fdb8b27b19a73ac10
# Dataset Card for Evaluation run of OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16](https://huggingface.co/OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddyEA__openbuddy-llama-30b-v7.1-bf16",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-23T10:49:57.900562](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddyEA__openbuddy-llama-30b-v7.1-bf16/blob/main/results_2023-09-23T10-49-57.900562.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.412751677852349,
        "em_stderr": 0.005041908586049675,
        "f1": 0.47628145973154495,
        "f1_stderr": 0.004825773123830683,
        "acc": 0.5456038961854937,
        "acc_stderr": 0.012271337118789107
    },
    "harness|drop|3": {
        "em": 0.412751677852349,
        "em_stderr": 0.005041908586049675,
        "f1": 0.47628145973154495,
        "f1_stderr": 0.004825773123830683
    },
    "harness|gsm8k|5": {
        "acc": 0.3161485974222896,
        "acc_stderr": 0.012807630673451477
    },
    "harness|winogrande|5": {
        "acc": 0.7750591949486977,
        "acc_stderr": 0.011735043564126735
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
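The snippet in the card above pulls the per-example details for one task; the aggregated numbers quoted under "Latest results" live in the separate "results" configuration mentioned in the summary. A minimal sketch of reading it, assuming the "latest" split name seen elsewhere in these repositories' metadata also exists for the results configuration:

```python
from datasets import load_dataset

# "results" stores the aggregated scores; "latest" is assumed to alias the
# most recent results file, mirroring the split layout in the run metadata.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddyEA__openbuddy-llama-30b-v7.1-bf16",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated-results table for the latest run
```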
open-llm-leaderboard/details_OpenBuddyEA__openbuddy-llama-30b-v7.1-bf16
[ "region:us" ]
2023-08-17T23:05:26+00:00
{"pretty_name": "Evaluation run of OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16](https://huggingface.co/OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddyEA__openbuddy-llama-30b-v7.1-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T10:49:57.900562](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddyEA__openbuddy-llama-30b-v7.1-bf16/blob/main/results_2023-09-23T10-49-57.900562.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.412751677852349,\n \"em_stderr\": 0.005041908586049675,\n \"f1\": 0.47628145973154495,\n \"f1_stderr\": 0.004825773123830683,\n \"acc\": 0.5456038961854937,\n \"acc_stderr\": 0.012271337118789107\n },\n \"harness|drop|3\": {\n \"em\": 0.412751677852349,\n \"em_stderr\": 0.005041908586049675,\n \"f1\": 0.47628145973154495,\n \"f1_stderr\": 0.004825773123830683\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3161485974222896,\n \"acc_stderr\": 0.012807630673451477\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|arc:challenge|25_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|arc:challenge|25_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T07_22_17.332814", "path": ["**/details_harness|drop|3_2023-09-17T07-22-17.332814.parquet"]}, {"split": "2023_09_23T10_49_57.900562", "path": ["**/details_harness|drop|3_2023-09-23T10-49-57.900562.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T10-49-57.900562.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T07_22_17.332814", "path": ["**/details_harness|gsm8k|5_2023-09-17T07-22-17.332814.parquet"]}, {"split": "2023_09_23T10_49_57.900562", "path": 
["**/details_harness|gsm8k|5_2023-09-23T10-49-57.900562.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T10-49-57.900562.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hellaswag|10_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hellaswag|10_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T17:39:33.477068.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T17:39:33.477068.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T17:52:33.029329.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T17:52:33.029329.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T17:52:33.029329.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T17:52:33.029329.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": 
[{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T17:52:33.029329.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T17:52:33.029329.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T07_22_17.332814", "path": ["**/details_harness|winogrande|5_2023-09-17T07-22-17.332814.parquet"]}, {"split": "2023_09_23T10_49_57.900562", "path": ["**/details_harness|winogrande|5_2023-09-23T10-49-57.900562.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T10-49-57.900562.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T17_39_33.477068", "path": ["results_2023-07-25T17:39:33.477068.parquet"]}, {"split": "2023_07_25T17_52_33.029329", "path": ["results_2023-07-25T17:52:33.029329.parquet"]}, {"split": "2023_09_17T07_22_17.332814", "path": ["results_2023-09-17T07-22-17.332814.parquet"]}, {"split": "2023_09_23T10_49_57.900562", "path": ["results_2023-09-23T10-49-57.900562.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T10-49-57.900562.parquet"]}]}]}
2023-09-23T09:50:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the example after this card text): ## Latest results These are the latest results from run 2023-09-23T10:49:57.900562 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
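The "To load the details from a run" step in the card above lost its code snippet when the text was flattened. A minimal sketch of what that snippet would look like, assuming the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other detail datasets in this document (the exact repository id is an assumption and should be verified):

```python
from datasets import load_dataset

# Assumed repository id, derived from the naming pattern of the other
# open-llm-leaderboard detail datasets in this document; verify before use.
data = load_dataset(
    "open-llm-leaderboard/details_OpenBuddyEA__openbuddy-llama-30b-v7.1-bf16",
    "harness_winogrande_5",  # any config name listed in the metadata above works
    split="train",           # per the card text, "train" points to the latest results
)
```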
[ "# Dataset Card for Evaluation run of OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T10:49:57.900562(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T10:49:57.900562(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 28, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddyEA/openbuddy-llama-30b-v7.1-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T10:49:57.900562(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
027e3b8ac2514673ce7ab9f8f99bc4c85aff8033
# Dataset Card for Evaluation run of LLMs/WizardLM-13B-V1.0 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/LLMs/WizardLM-13B-V1.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [LLMs/WizardLM-13B-V1.0](https://huggingface.co/LLMs/WizardLM-13B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LLMs__WizardLM-13B-V1.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-23T04:14:08.122970](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__WizardLM-13B-V1.0/blob/main/results_2023-09-23T04-14-08.122970.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.06669463087248322, "em_stderr": 0.0025550356847442663, "f1": 0.1407225251677848, "f1_stderr": 0.002886547618779489, "acc": 0.44106833942386575, "acc_stderr": 0.010948605580118738 }, "harness|drop|3": { "em": 0.06669463087248322, "em_stderr": 0.0025550356847442663, "f1": 0.1407225251677848, "f1_stderr": 0.002886547618779489 }, "harness|gsm8k|5": { "acc": 0.14101592115238817, "acc_stderr": 0.009586695349244096 }, "harness|winogrande|5": { "acc": 0.7411207576953434, "acc_stderr": 0.01231051581099338 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_LLMs__WizardLM-13B-V1.0
[ "region:us" ]
2023-08-17T23:05:43+00:00
{"pretty_name": "Evaluation run of LLMs/WizardLM-13B-V1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [LLMs/WizardLM-13B-V1.0](https://huggingface.co/LLMs/WizardLM-13B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LLMs__WizardLM-13B-V1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T04:14:08.122970](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__WizardLM-13B-V1.0/blob/main/results_2023-09-23T04-14-08.122970.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06669463087248322,\n \"em_stderr\": 0.0025550356847442663,\n \"f1\": 0.1407225251677848,\n \"f1_stderr\": 0.002886547618779489,\n \"acc\": 0.44106833942386575,\n \"acc_stderr\": 0.010948605580118738\n },\n \"harness|drop|3\": {\n \"em\": 0.06669463087248322,\n \"em_stderr\": 0.0025550356847442663,\n \"f1\": 0.1407225251677848,\n \"f1_stderr\": 0.002886547618779489\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14101592115238817,\n \"acc_stderr\": 0.009586695349244096\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.01231051581099338\n }\n}\n```", "repo_url": "https://huggingface.co/LLMs/WizardLM-13B-V1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|arc:challenge|25_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T04_14_08.122970", "path": ["**/details_harness|drop|3_2023-09-23T04-14-08.122970.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T04-14-08.122970.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T04_14_08.122970", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-14-08.122970.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-14-08.122970.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hellaswag|10_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:48:39.011618.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:48:39.011618.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T12:48:39.011618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T12:48:39.011618.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T12:48:39.011618.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T04_14_08.122970", "path": ["**/details_harness|winogrande|5_2023-09-23T04-14-08.122970.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T04-14-08.122970.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T12_48_39.011618", "path": ["results_2023-07-24T12:48:39.011618.parquet"]}, {"split": "2023_09_23T04_14_08.122970", "path": ["results_2023-09-23T04-14-08.122970.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T04-14-08.122970.parquet"]}]}]}
2023-09-23T03:14:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of LLMs/WizardLM-13B-V1.0 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model LLMs/WizardLM-13B-V1.0 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-23T04:14:08.122970 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
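The loading example referenced in the summary above was stripped when this card text was flattened; a minimal sketch of that call follows. The repository id below is an assumption based on the leaderboard's `details_<org>__<model>` naming pattern, while the `harness_winogrande_5` and `results` configs and their `latest` split appear verbatim in the configuration metadata above.

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard
# details_<org>__<model> naming pattern for this model.
REPO = "open-llm-leaderboard/details_LLMs__WizardLM-13B-V1.0"

# Per-example details for the winogrande task; the "latest" split always
# points at the parquet files of the most recent evaluation run.
winogrande_details = load_dataset(REPO, "harness_winogrande_5", split="latest")

# Aggregated metrics for every run are stored under the "results" config.
results = load_dataset(REPO, "results", split="latest")
```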
[ "# Dataset Card for Evaluation run of LLMs/WizardLM-13B-V1.0", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LLMs/WizardLM-13B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T04:14:08.122970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of LLMs/WizardLM-13B-V1.0", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LLMs/WizardLM-13B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-23T04:14:08.122970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of LLMs/WizardLM-13B-V1.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model LLMs/WizardLM-13B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T04:14:08.122970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
238df92867c0406711b3d38e6f64d32a7bc19a02
# Dataset Card for Evaluation run of LLMs/AlpacaGPT4-7B-elina

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/LLMs/AlpacaGPT4-7B-elina
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [LLMs/AlpacaGPT4-7B-elina](https://huggingface.co/LLMs/AlpacaGPT4-7B-elina) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T04:06:06.586475](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina/blob/main/results_2023-10-15T04-06-06.586475.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.003460570469798658,
        "em_stderr": 0.0006013962884271144,
        "f1": 0.06020763422818805,
        "f1_stderr": 0.001415436583944496,
        "acc": 0.38620148841562185,
        "acc_stderr": 0.009130838881295832
    },
    "harness|drop|3": {
        "em": 0.003460570469798658,
        "em_stderr": 0.0006013962884271144,
        "f1": 0.06020763422818805,
        "f1_stderr": 0.001415436583944496
    },
    "harness|gsm8k|5": {
        "acc": 0.045489006823351025,
        "acc_stderr": 0.005739657656722211
    },
    "harness|winogrande|5": {
        "acc": 0.7269139700078927,
        "acc_stderr": 0.012522020105869454
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina
[ "region:us" ]
2023-08-17T23:05:52+00:00
{"pretty_name": "Evaluation run of LLMs/AlpacaGPT4-7B-elina", "dataset_summary": "Dataset automatically created during the evaluation run of model [LLMs/AlpacaGPT4-7B-elina](https://huggingface.co/LLMs/AlpacaGPT4-7B-elina) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T04:06:06.586475](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina/blob/main/results_2023-10-15T04-06-06.586475.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003460570469798658,\n \"em_stderr\": 0.0006013962884271144,\n \"f1\": 0.06020763422818805,\n \"f1_stderr\": 0.001415436583944496,\n \"acc\": 0.38620148841562185,\n \"acc_stderr\": 0.009130838881295832\n },\n \"harness|drop|3\": {\n \"em\": 0.003460570469798658,\n \"em_stderr\": 0.0006013962884271144,\n \"f1\": 0.06020763422818805,\n \"f1_stderr\": 0.001415436583944496\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \"acc_stderr\": 0.005739657656722211\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869454\n }\n}\n```", "repo_url": "https://huggingface.co/LLMs/AlpacaGPT4-7B-elina", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T04_06_06.586475", "path": ["**/details_harness|drop|3_2023-10-15T04-06-06.586475.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T04-06-06.586475.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T04_06_06.586475", "path": ["**/details_harness|gsm8k|5_2023-10-15T04-06-06.586475.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T04-06-06.586475.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:21:37.483871.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:21:37.483871.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T04_06_06.586475", "path": ["**/details_harness|winogrande|5_2023-10-15T04-06-06.586475.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T04-06-06.586475.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T12_21_37.483871", "path": ["results_2023-07-18T12:21:37.483871.parquet"]}, {"split": "2023_10_15T04_06_06.586475", "path": ["results_2023-10-15T04-06-06.586475.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T04-06-06.586475.parquet"]}]}]}
2023-10-15T03:06:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of LLMs/AlpacaGPT4-7B-elina ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model LLMs/AlpacaGPT4-7B-elina on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T04:06:06.586475 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
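The processed card text above mentions loading the details from a run, but its code snippet is not included in this processed field. A minimal sketch is given below. The repository id `open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina` is inferred from the `open-llm-leaderboard/details_<org>__<model>` naming pattern of the neighbouring records, and the config and split names are taken from the metadata block above, so treat this as an illustration rather than the card's verbatim snippet.

```python
from datasets import load_dataset

# Minimal sketch of the loader call the card alludes to.
# The repository id is inferred from the naming pattern of the other records
# in this dump; the config and split names come from the metadata listed above.
data = load_dataset(
    "open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina",
    "harness_winogrande_5",
    split="latest",  # "latest" mirrors the most recent run (2023-10-15T04:06:06.586475)
)
print(data)
```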
[ "# Dataset Card for Evaluation run of LLMs/AlpacaGPT4-7B-elina", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LLMs/AlpacaGPT4-7B-elina on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T04:06:06.586475(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of LLMs/AlpacaGPT4-7B-elina", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model LLMs/AlpacaGPT4-7B-elina on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T04:06:06.586475(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of LLMs/AlpacaGPT4-7B-elina## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model LLMs/AlpacaGPT4-7B-elina on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T04:06:06.586475(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d7d6314dac7dda8025883ef5543f4044a9b33da1
# Dataset Card for Evaluation run of shareAI/bimoGPT-llama2-13b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/shareAI/bimoGPT-llama2-13b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [shareAI/bimoGPT-llama2-13b](https://huggingface.co/shareAI/bimoGPT-llama2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T02:26:34.744739](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b/blob/main/results_2023-09-17T02-26-34.744739.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298278,
        "f1": 0.05842386744966444,
        "f1_stderr": 0.0013305449660371358,
        "acc": 0.43888155205954144,
        "acc_stderr": 0.01031967359624197
    },
    "harness|drop|3": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298278,
        "f1": 0.05842386744966444,
        "f1_stderr": 0.0013305449660371358
    },
    "harness|gsm8k|5": {
        "acc": 0.11296436694465505,
        "acc_stderr": 0.008719339028833059
    },
    "harness|winogrande|5": {
        "acc": 0.7647987371744278,
        "acc_stderr": 0.011920008163650882
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b
[ "region:us" ]
2023-08-17T23:06:00+00:00
{"pretty_name": "Evaluation run of shareAI/bimoGPT-llama2-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [shareAI/bimoGPT-llama2-13b](https://huggingface.co/shareAI/bimoGPT-llama2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T02:26:34.744739](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b/blob/main/results_2023-09-17T02-26-34.744739.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298278,\n \"f1\": 0.05842386744966444,\n \"f1_stderr\": 0.0013305449660371358,\n \"acc\": 0.43888155205954144,\n \"acc_stderr\": 0.01031967359624197\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298278,\n \"f1\": 0.05842386744966444,\n \"f1_stderr\": 0.0013305449660371358\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \"acc_stderr\": 0.008719339028833059\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650882\n }\n}\n```", "repo_url": "https://huggingface.co/shareAI/bimoGPT-llama2-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|arc:challenge|25_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T02_26_34.744739", "path": ["**/details_harness|drop|3_2023-09-17T02-26-34.744739.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T02-26-34.744739.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T02_26_34.744739", "path": ["**/details_harness|gsm8k|5_2023-09-17T02-26-34.744739.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T02-26-34.744739.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hellaswag|10_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T18:04:32.310000.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T18:04:32.310000.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T02_26_34.744739", "path": ["**/details_harness|winogrande|5_2023-09-17T02-26-34.744739.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T02-26-34.744739.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T18_04_32.310000", "path": ["results_2023-08-09T18:04:32.310000.parquet"]}, {"split": "2023_09_17T02_26_34.744739", "path": ["results_2023-09-17T02-26-34.744739.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T02-26-34.744739.parquet"]}]}]}
2023-09-17T01:26:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of shareAI/bimoGPT-llama2-13b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model shareAI/bimoGPT-llama2-13b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T02:26:34.744739 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of shareAI/bimoGPT-llama2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model shareAI/bimoGPT-llama2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T02:26:34.744739(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of shareAI/bimoGPT-llama2-13b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model shareAI/bimoGPT-llama2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T02:26:34.744739(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shareAI/bimoGPT-llama2-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model shareAI/bimoGPT-llama2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T02:26:34.744739(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2e484a794e08917a5648886ddc782fd41292efde
# Dataset Card for Evaluation run of shareAI/llama2-13b-Chinese-chat

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/shareAI/llama2-13b-Chinese-chat
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [shareAI/llama2-13b-Chinese-chat](https://huggingface.co/shareAI/llama2-13b-Chinese-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T14:15:31.238109](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat/blob/main/results_2023-09-22T14-15-31.238109.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.00041913301788268803,
        "f1": 0.062396182885906,
        "f1_stderr": 0.0013783953134948932,
        "acc": 0.4400498930990388,
        "acc_stderr": 0.010318502304108787
    },
    "harness|drop|3": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.00041913301788268803,
        "f1": 0.062396182885906,
        "f1_stderr": 0.0013783953134948932
    },
    "harness|gsm8k|5": {
        "acc": 0.11372251705837756,
        "acc_stderr": 0.008744810131034052
    },
    "harness|winogrande|5": {
        "acc": 0.7663772691397001,
        "acc_stderr": 0.011892194477183524
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
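As a supplementary illustration of the split layout described in the card above (one timestamped split per run plus a "latest" alias), the sketch below loads the same configuration both ways. The config and split names are copied from this dataset's metadata record further down; since both splits reference the same parquet file there, the two loads are expected to return the same data.

```python
from datasets import load_dataset

# Sketch: each configuration exposes one split per run plus a "latest" alias.
# Config and split names are taken from the metadata record for this dataset;
# both splits point at the same parquet file, so the row counts should match.
repo = "open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat"
drop_latest = load_dataset(repo, "harness_drop_3", split="latest")
drop_run = load_dataset(repo, "harness_drop_3", split="2023_09_22T14_15_31.238109")
print(drop_latest.num_rows == drop_run.num_rows)  # expected: True
```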
open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat
[ "region:us" ]
2023-08-17T23:06:10+00:00
{"pretty_name": "Evaluation run of shareAI/llama2-13b-Chinese-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [shareAI/llama2-13b-Chinese-chat](https://huggingface.co/shareAI/llama2-13b-Chinese-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T14:15:31.238109](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat/blob/main/results_2023-09-22T14-15-31.238109.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268803,\n \"f1\": 0.062396182885906,\n \"f1_stderr\": 0.0013783953134948932,\n \"acc\": 0.4400498930990388,\n \"acc_stderr\": 0.010318502304108787\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268803,\n \"f1\": 0.062396182885906,\n \"f1_stderr\": 0.0013783953134948932\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \"acc_stderr\": 0.008744810131034052\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n }\n}\n```", "repo_url": "https://huggingface.co/shareAI/llama2-13b-Chinese-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T14_15_31.238109", "path": ["**/details_harness|drop|3_2023-09-22T14-15-31.238109.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T14-15-31.238109.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T14_15_31.238109", "path": ["**/details_harness|gsm8k|5_2023-09-22T14-15-31.238109.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T14-15-31.238109.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:02:56.948315.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:02:56.948315.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T14_15_31.238109", "path": ["**/details_harness|winogrande|5_2023-09-22T14-15-31.238109.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T14-15-31.238109.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T17_02_56.948315", "path": ["results_2023-08-09T17:02:56.948315.parquet"]}, {"split": "2023_09_22T14_15_31.238109", "path": ["results_2023-09-22T14-15-31.238109.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T14-15-31.238109.parquet"]}]}]}
2023-09-22T13:15:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of shareAI/llama2-13b-Chinese-chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model shareAI/llama2-13b-Chinese-chat on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T14:15:31.238109 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
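A minimal sketch of the loading call referenced in the summary above (same repository and config names as in the full card earlier in this record):

```python
from datasets import load_dataset

# Load the winogrande details split of this evaluation-details dataset.
data = load_dataset(
    "open-llm-leaderboard/details_shareAI__llama2-13b-Chinese-chat",
    "harness_winogrande_5",
    split="train",
)
```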
[ "# Dataset Card for Evaluation run of shareAI/llama2-13b-Chinese-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model shareAI/llama2-13b-Chinese-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T14:15:31.238109(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of shareAI/llama2-13b-Chinese-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model shareAI/llama2-13b-Chinese-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T14:15:31.238109(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shareAI/llama2-13b-Chinese-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model shareAI/llama2-13b-Chinese-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T14:15:31.238109(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
07fba3bddd54fef0cb93b5e3e89fdb87d8d782a7
SDXL_REGULARIZATION_IMAGES Dataset v1 Prompt: Beautiful girl Negative Prompt: child Resolution: (1024, 1024) Base Model: sd_xl_base_1.0_0.9vae.safetensors, Refiner Model: sd_xl_refiner_1.0_0.9vae.safetensors LoRA [sd_xl_offset_example-lora_1.0.safetensors] weight: 0.5. More datasets will be added in the future. Show your support by clicking like
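The settings above describe how the regularization images were generated. Below is a minimal, hedged sketch of how a similar image might be produced with the `diffusers` library; the file paths are assumed to be local copies of the listed checkpoints, and the sampler, step count, and seed used by the author are not documented, so this is an approximation rather than the author's actual pipeline.

```python
import torch
from diffusers import StableDiffusionXLPipeline, StableDiffusionXLImg2ImgPipeline

# Base SDXL pipeline from the listed checkpoint (assumed to be a local file).
base = StableDiffusionXLPipeline.from_single_file(
    "sd_xl_base_1.0_0.9vae.safetensors", torch_dtype=torch.float16
).to("cuda")

# Offset-noise LoRA listed in the card, applied at the stated weight of 0.5.
base.load_lora_weights("sd_xl_offset_example-lora_1.0.safetensors")

prompt = "Beautiful girl"
negative_prompt = "child"

image = base(
    prompt=prompt,
    negative_prompt=negative_prompt,
    width=1024,
    height=1024,
    cross_attention_kwargs={"scale": 0.5},  # LoRA weight from the card
).images[0]

# Optional refinement pass with the listed refiner checkpoint.
refiner = StableDiffusionXLImg2ImgPipeline.from_single_file(
    "sd_xl_refiner_1.0_0.9vae.safetensors", torch_dtype=torch.float16
).to("cuda")
image = refiner(prompt=prompt, negative_prompt=negative_prompt, image=image).images[0]

image.save("regularization_image.png")
```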
Unknown-User/SDXL_REGULARIZATION_IMAGES
[ "license:openrail", "region:us" ]
2023-08-17T23:06:12+00:00
{"license": "openrail"}
2023-08-18T12:32:39+00:00
[]
[]
TAGS #license-openrail #region-us
SDXL_REGULARIZATION_IMAGES Dataset v1 Prompt: Beautiful girl Negative Prompt: child Resolution: (1024, 1024) Base Model: sd_xl_base_1.0_0.9vae.safetensors, Refiner Model: sd_xl_refiner_1.0_0.9vae.safetensors LoRA [sd_xl_offset_example-lora_1.0.safetensors] weight: 0.5. More datasets will be added in the future. Show your support by clicking like
[]
[ "TAGS\n#license-openrail #region-us \n" ]
[ 12 ]
[ "passage: TAGS\n#license-openrail #region-us \n" ]
99fec25e465ccd4bfa69b3b3600dd138317e5ac1
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-1.3b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/PygmalionAI/pygmalion-1.3b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-1.3b](https://huggingface.co/PygmalionAI/pygmalion-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__pygmalion-1.3b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-15T07:13:21.177207](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-1.3b/blob/main/results_2023-10-15T07-13-21.177207.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.02946728187919463,
        "em_stderr": 0.0017318679706719317,
        "f1": 0.06647021812080542,
        "f1_stderr": 0.002046982940584873,
        "acc": 0.250197316495659,
        "acc_stderr": 0.007026240653024758
    },
    "harness|drop|3": {
        "em": 0.02946728187919463,
        "em_stderr": 0.0017318679706719317,
        "f1": 0.06647021812080542,
        "f1_stderr": 0.002046982940584873
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.500394632991318,
        "acc_stderr": 0.014052481306049516
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
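As a complement to the snippet in the card above, one way to see which of the 64 configurations are actually present is to enumerate them first. This is a small sketch assuming the `datasets` library; the config name "harness_drop_3" and the "latest" split are taken from the metadata listed below.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_PygmalionAI__pygmalion-1.3b"

# List every configuration declared for this details dataset
# (per-task harness configs plus the aggregated "results" config).
configs = get_dataset_config_names(repo)
print(configs)

# Load the "latest" split of one of the per-task configs.
drop_details = load_dataset(repo, "harness_drop_3", split="latest")
print(drop_details[0].keys())
```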
open-llm-leaderboard/details_PygmalionAI__pygmalion-1.3b
[ "region:us" ]
2023-08-17T23:06:19+00:00
{"pretty_name": "Evaluation run of PygmalionAI/pygmalion-1.3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-1.3b](https://huggingface.co/PygmalionAI/pygmalion-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__pygmalion-1.3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T07:13:21.177207](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-1.3b/blob/main/results_2023-10-15T07-13-21.177207.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02946728187919463,\n \"em_stderr\": 0.0017318679706719317,\n \"f1\": 0.06647021812080542,\n \"f1_stderr\": 0.002046982940584873,\n \"acc\": 0.250197316495659,\n \"acc_stderr\": 0.007026240653024758\n },\n \"harness|drop|3\": {\n \"em\": 0.02946728187919463,\n \"em_stderr\": 0.0017318679706719317,\n \"f1\": 0.06647021812080542,\n \"f1_stderr\": 0.002046982940584873\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.500394632991318,\n \"acc_stderr\": 0.014052481306049516\n }\n}\n```", "repo_url": "https://huggingface.co/PygmalionAI/pygmalion-1.3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T07_13_21.177207", "path": ["**/details_harness|drop|3_2023-10-15T07-13-21.177207.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T07-13-21.177207.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T07_13_21.177207", "path": ["**/details_harness|gsm8k|5_2023-10-15T07-13-21.177207.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T07-13-21.177207.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:14.842065.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:14.842065.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:14.842065.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:47:14.842065.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:47:14.842065.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T07_13_21.177207", "path": ["**/details_harness|winogrande|5_2023-10-15T07-13-21.177207.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T07-13-21.177207.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_47_14.842065", "path": ["results_2023-07-19T14:47:14.842065.parquet"]}, {"split": "2023_10_15T07_13_21.177207", "path": ["results_2023-10-15T07-13-21.177207.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T07-13-21.177207.parquet"]}]}]}
2023-10-15T06:14:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-1.3b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model PygmalionAI/pygmalion-1.3b on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-15T07:13:21.177207(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
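The stripped card text above refers to a loading snippet that is not reproduced in this derived field. A minimal sketch of what such a call could look like, assuming the dataset id follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by these evaluation repos, and using the `harness_winogrande_5` configuration and `latest` split listed in the metadata above:

```python
from datasets import load_dataset

# Assumed dataset id, built from the naming pattern used elsewhere in this dump.
dataset_id = "open-llm-leaderboard/details_PygmalionAI__pygmalion-1.3b"

# Load one task configuration; the "latest" split points at the most recent run.
data = load_dataset(dataset_id, "harness_winogrande_5", split="latest")
print(data)
```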
[ "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-1.3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T07:13:21.177207(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-1.3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-15T07:13:21.177207(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PygmalionAI/pygmalion-1.3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T07:13:21.177207(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
76f41bec0e37f72288a1a5505c502da850a741ec
# Dataset Card for Evaluation run of PygmalionAI/metharme-1.3b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/PygmalionAI/metharme-1.3b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [PygmalionAI/metharme-1.3b](https://huggingface.co/PygmalionAI/metharme-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__metharme-1.3b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T18:39:45.920651](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__metharme-1.3b/blob/main/results_2023-09-22T18-39-45.920651.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001572986577181208,
        "em_stderr": 0.00040584511324177333,
        "f1": 0.04728187919463099,
        "f1_stderr": 0.0012123660755283244,
        "acc": 0.2859533393610357,
        "acc_stderr": 0.008162495625846476
    },
    "harness|drop|3": {
        "em": 0.001572986577181208,
        "em_stderr": 0.00040584511324177333,
        "f1": 0.04728187919463099,
        "f1_stderr": 0.0012123660755283244
    },
    "harness|gsm8k|5": {
        "acc": 0.0075815011372251705,
        "acc_stderr": 0.002389281512077243
    },
    "harness|winogrande|5": {
        "acc": 0.5643251775848461,
        "acc_stderr": 0.01393570973961571
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
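The card above notes that aggregated metrics live in a separate "results" configuration and that timestamped splits track individual runs. A small hedged sketch of pulling those aggregates for this model, assuming the "results" configuration and "latest" split are published for this repo as they are for the other evaluation datasets in this dump:

```python
from datasets import load_dataset

# Aggregated metrics for PygmalionAI/metharme-1.3b; the "results" configuration
# and "latest" split are assumed here, mirroring the other evaluation repos.
results = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__metharme-1.3b",
    "results",
    split="latest",
)
print(results[0])
```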
open-llm-leaderboard/details_PygmalionAI__metharme-1.3b
[ "region:us" ]
2023-08-17T23:06:28+00:00
{"pretty_name": "Evaluation run of PygmalionAI/metharme-1.3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [PygmalionAI/metharme-1.3b](https://huggingface.co/PygmalionAI/metharme-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__metharme-1.3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T18:39:45.920651](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__metharme-1.3b/blob/main/results_2023-09-22T18-39-45.920651.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.04728187919463099,\n \"f1_stderr\": 0.0012123660755283244,\n \"acc\": 0.2859533393610357,\n \"acc_stderr\": 0.008162495625846476\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.04728187919463099,\n \"f1_stderr\": 0.0012123660755283244\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.002389281512077243\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5643251775848461,\n \"acc_stderr\": 0.01393570973961571\n }\n}\n```", "repo_url": "https://huggingface.co/PygmalionAI/metharme-1.3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T18_39_45.920651", "path": ["**/details_harness|drop|3_2023-09-22T18-39-45.920651.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T18-39-45.920651.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T18_39_45.920651", "path": ["**/details_harness|gsm8k|5_2023-09-22T18-39-45.920651.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T18-39-45.920651.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:50:43.188696.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:50:43.188696.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T18_39_45.920651", "path": ["**/details_harness|winogrande|5_2023-09-22T18-39-45.920651.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T18-39-45.920651.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_50_43.188696", "path": ["results_2023-07-19T14:50:43.188696.parquet"]}, {"split": "2023_09_22T18_39_45.920651", "path": ["results_2023-09-22T18-39-45.920651.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T18-39-45.920651.parquet"]}]}]}
2023-09-22T17:39:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PygmalionAI/metharme-1.3b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model PygmalionAI/metharme-1.3b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T18:39:45.920651 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
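A minimal loading sketch for the plain-text card above, since its code example was stripped in this rendering: the dataset id is assumed to follow the leaderboard's `details_<org>__<model>` naming convention, while the configuration and split names come from this record's own config list.

```python
from datasets import load_dataset

# Dataset id assumed from the leaderboard's details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__metharme-1.3b",
    "harness_winogrande_5",  # one of the 64 task configurations
    split="latest",          # or a timestamped split for one specific run
)
```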
[ "# Dataset Card for Evaluation run of PygmalionAI/metharme-1.3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/metharme-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T18:39:45.920651(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PygmalionAI/metharme-1.3b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/metharme-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T18:39:45.920651(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PygmalionAI/metharme-1.3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/metharme-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T18:39:45.920651(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
586de40c9706a6c0d81a25b5cd77f116714c9d83
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-2.7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/PygmalionAI/pygmalion-2.7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-2.7b](https://huggingface.co/PygmalionAI/pygmalion-2.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PygmalionAI__pygmalion-2.7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-22T20:17:59.683847](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2.7b/blob/main/results_2023-09-22T20-17-59.683847.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.04320469798657718, "em_stderr": 0.0020821626664430564, "f1": 0.08408347315436249, "f1_stderr": 0.0023636579014392274, "acc": 0.2825572217837411, "acc_stderr": 0.006966407055209012 }, "harness|drop|3": { "em": 0.04320469798657718, "em_stderr": 0.0020821626664430564, "f1": 0.08408347315436249, "f1_stderr": 0.0023636579014392274 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.5651144435674822, "acc_stderr": 0.013932814110418024 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
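Besides the per-task example in the card above, a minimal sketch for pulling the aggregated scores is given below; the "results" configuration and "latest" split names are taken from this card's own config list, and the rest is illustrative.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of PygmalionAI/pygmalion-2.7b.
results = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__pygmalion-2.7b",
    "results",       # aggregated-results configuration
    split="latest",  # points at the newest run
)
print(results[0])    # inspect the aggregated scores
```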
open-llm-leaderboard/details_PygmalionAI__pygmalion-2.7b
[ "region:us" ]
2023-08-17T23:06:38+00:00
{"pretty_name": "Evaluation run of PygmalionAI/pygmalion-2.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-2.7b](https://huggingface.co/PygmalionAI/pygmalion-2.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__pygmalion-2.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T20:17:59.683847](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-2.7b/blob/main/results_2023-09-22T20-17-59.683847.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04320469798657718,\n \"em_stderr\": 0.0020821626664430564,\n \"f1\": 0.08408347315436249,\n \"f1_stderr\": 0.0023636579014392274,\n \"acc\": 0.2825572217837411,\n \"acc_stderr\": 0.006966407055209012\n },\n \"harness|drop|3\": {\n \"em\": 0.04320469798657718,\n \"em_stderr\": 0.0020821626664430564,\n \"f1\": 0.08408347315436249,\n \"f1_stderr\": 0.0023636579014392274\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5651144435674822,\n \"acc_stderr\": 0.013932814110418024\n }\n}\n```", "repo_url": "https://huggingface.co/PygmalionAI/pygmalion-2.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T20_17_59.683847", "path": ["**/details_harness|drop|3_2023-09-22T20-17-59.683847.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T20-17-59.683847.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T20_17_59.683847", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-17-59.683847.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-17-59.683847.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:05.422128.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:05.422128.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:05.422128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:36:05.422128.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:36:05.422128.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T20_17_59.683847", "path": ["**/details_harness|winogrande|5_2023-09-22T20-17-59.683847.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T20-17-59.683847.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_36_05.422128", "path": ["results_2023-07-19T16:36:05.422128.parquet"]}, {"split": "2023_09_22T20_17_59.683847", "path": ["results_2023-09-22T20-17-59.683847.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T20-17-59.683847.parquet"]}]}]}
2023-09-22T19:18:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-2.7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model PygmalionAI/pygmalion-2.7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T20:17:59.683847 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
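The loading call referenced by "you can for instance do the following:" was dropped from this plain-text rendering; a minimal sketch, mirroring the full card earlier in this record, would be:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__pygmalion-2.7b",
    "harness_winogrande_5",
    split="train",  # the "train" split always points to the latest results
)
```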
[ "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-2.7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T20:17:59.683847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-2.7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T20:17:59.683847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PygmalionAI/pygmalion-2.7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T20:17:59.683847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
26b8234999d51200828eeef6f9f2fd51acf06893
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-350m

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/PygmalionAI/pygmalion-350m
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-350m](https://huggingface.co/PygmalionAI/pygmalion-350m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__pygmalion-350m", "harness_winogrande_5", split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T15:33:29.542088](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-350m/blob/main/results_2023-10-14T15-33-29.542088.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001363255033557047,
        "em_stderr": 0.0003778609196460963,
        "f1": 0.03894609899328867,
        "f1_stderr": 0.0011582048286439316,
        "acc": 0.2540347408676421,
        "acc_stderr": 0.008026788466282256
    },
    "harness|drop|3": {
        "em": 0.001363255033557047,
        "em_stderr": 0.0003778609196460963,
        "f1": 0.03894609899328867,
        "f1_stderr": 0.0011582048286439316
    },
    "harness|gsm8k|5": {
        "acc": 0.00530705079605762,
        "acc_stderr": 0.002001305720948079
    },
    "harness|winogrande|5": {
        "acc": 0.5027624309392266,
        "acc_stderr": 0.014052271211616433
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
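Beyond the per-task details shown above, the card also mentions a "results" configuration holding the aggregated scores. Below is a minimal sketch of loading it, reusing the repository, configuration, and split names listed in this card; the returned schema is not documented here, so it is inspected rather than assumed.

```python
from datasets import load_dataset

# Aggregated per-run scores; "results" and "latest" are the configuration
# and split names listed in this card's config section.
results = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__pygmalion-350m",
    "results",
    split="latest",
)

# The exact columns depend on the harness version, so inspect them first.
print(results.column_names)
```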
open-llm-leaderboard/details_PygmalionAI__pygmalion-350m
[ "region:us" ]
2023-08-17T23:06:46+00:00
{"pretty_name": "Evaluation run of PygmalionAI/pygmalion-350m", "dataset_summary": "Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-350m](https://huggingface.co/PygmalionAI/pygmalion-350m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__pygmalion-350m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T15:33:29.542088](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-350m/blob/main/results_2023-10-14T15-33-29.542088.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460963,\n \"f1\": 0.03894609899328867,\n \"f1_stderr\": 0.0011582048286439316,\n \"acc\": 0.2540347408676421,\n \"acc_stderr\": 0.008026788466282256\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460963,\n \"f1\": 0.03894609899328867,\n \"f1_stderr\": 0.0011582048286439316\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.002001305720948079\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616433\n }\n}\n```", "repo_url": "https://huggingface.co/PygmalionAI/pygmalion-350m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T15_33_29.542088", "path": ["**/details_harness|drop|3_2023-10-14T15-33-29.542088.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T15-33-29.542088.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T15_33_29.542088", "path": ["**/details_harness|gsm8k|5_2023-10-14T15-33-29.542088.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T15-33-29.542088.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:12.933882.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:12.933882.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:12.933882.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:13:12.933882.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:13:12.933882.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T15_33_29.542088", "path": ["**/details_harness|winogrande|5_2023-10-14T15-33-29.542088.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T15-33-29.542088.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_13_12.933882", "path": ["results_2023-07-19T14:13:12.933882.parquet"]}, {"split": "2023_10_14T15_33_29.542088", "path": ["results_2023-10-14T15-33-29.542088.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T15-33-29.542088.parquet"]}]}]}
2023-10-14T14:33:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-350m ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model PygmalionAI/pygmalion-350m on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-14T15:33:29.542088 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-350m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-350m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T15:33:29.542088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-350m", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-350m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T15:33:29.542088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PygmalionAI/pygmalion-350m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-350m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T15:33:29.542088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0d51c9addc722838139df810c31bbd2d6ac07841
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-6b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/PygmalionAI/pygmalion-6b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-6b](https://huggingface.co/PygmalionAI/pygmalion-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PygmalionAI__pygmalion-6b", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-10-08T20:04:23.834964](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-6b/blob/main/results_2023-10-08T20-04-23.834964.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.26347154250909116, "acc_stderr": 0.03165492423612406, "acc_norm": 0.26689039326246145, "acc_norm_stderr": 0.03165325674877226, "mc1": 0.20195838433292534, "mc1_stderr": 0.014053957441512359, "mc2": 0.3253448533993895, "mc2_stderr": 0.013862486209403098 }, "harness|arc:challenge|25": { "acc": 0.3728668941979522, "acc_stderr": 0.014131176760131165, "acc_norm": 0.4052901023890785, "acc_norm_stderr": 0.014346869060229323 }, "harness|hellaswag|10": { "acc": 0.5053774148575981, "acc_stderr": 0.004989492828168535, "acc_norm": 0.6746664011153157, "acc_norm_stderr": 0.004675418774314239 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768081, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.25925925925925924, "acc_stderr": 0.03785714465066653, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.03785714465066653 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3092105263157895, "acc_stderr": 0.037610708698674805, "acc_norm": 0.3092105263157895, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.23773584905660378, "acc_stderr": 0.0261998088075619, "acc_norm": 0.23773584905660378, "acc_norm_stderr": 0.0261998088075619 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.20833333333333334, "acc_stderr": 0.03396116205845333, "acc_norm": 0.20833333333333334, "acc_norm_stderr": 0.03396116205845333 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.13, "acc_stderr": 0.03379976689896308, "acc_norm": 0.13, "acc_norm_stderr": 
0.03379976689896308 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.042295258468165065, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2543352601156069, "acc_stderr": 0.0332055644308557, "acc_norm": 0.2543352601156069, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.18627450980392157, "acc_stderr": 0.03873958714149351, "acc_norm": 0.18627450980392157, "acc_norm_stderr": 0.03873958714149351 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3404255319148936, "acc_stderr": 0.030976692998534436, "acc_norm": 0.3404255319148936, "acc_norm_stderr": 0.030976692998534436 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.040493392977481425, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.040493392977481425 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2689655172413793, "acc_stderr": 0.03695183311650232, "acc_norm": 0.2689655172413793, "acc_norm_stderr": 0.03695183311650232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.24603174603174602, "acc_stderr": 0.022182037202948368, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.022182037202948368 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23809523809523808, "acc_stderr": 0.03809523809523811, "acc_norm": 0.23809523809523808, "acc_norm_stderr": 0.03809523809523811 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.23548387096774193, "acc_stderr": 0.02413763242933771, "acc_norm": 0.23548387096774193, "acc_norm_stderr": 0.02413763242933771 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.23645320197044334, "acc_stderr": 0.029896114291733552, "acc_norm": 0.23645320197044334, "acc_norm_stderr": 0.029896114291733552 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2545454545454545, "acc_stderr": 0.03401506715249039, "acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.20707070707070707, "acc_stderr": 0.028869778460267042, "acc_norm": 0.20707070707070707, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.22797927461139897, "acc_stderr": 0.03027690994517826, "acc_norm": 0.22797927461139897, "acc_norm_stderr": 0.03027690994517826 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2512820512820513, "acc_stderr": 0.021992016662370526, "acc_norm": 0.2512820512820513, "acc_norm_stderr": 0.021992016662370526 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26666666666666666, "acc_stderr": 0.02696242432507383, "acc_norm": 0.26666666666666666, "acc_norm_stderr": 0.02696242432507383 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2184873949579832, "acc_stderr": 
0.02684151432295894, "acc_norm": 0.2184873949579832, "acc_norm_stderr": 0.02684151432295894 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2185430463576159, "acc_stderr": 0.03374235550425694, "acc_norm": 0.2185430463576159, "acc_norm_stderr": 0.03374235550425694 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.26788990825688075, "acc_stderr": 0.018987462257978652, "acc_norm": 0.26788990825688075, "acc_norm_stderr": 0.018987462257978652 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1574074074074074, "acc_stderr": 0.02483717351824239, "acc_norm": 0.1574074074074074, "acc_norm_stderr": 0.02483717351824239 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.3088235294117647, "acc_stderr": 0.03242661719827218, "acc_norm": 0.3088235294117647, "acc_norm_stderr": 0.03242661719827218 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.2616033755274262, "acc_stderr": 0.028609516716994934, "acc_norm": 0.2616033755274262, "acc_norm_stderr": 0.028609516716994934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3542600896860987, "acc_stderr": 0.032100621541349864, "acc_norm": 0.3542600896860987, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.20610687022900764, "acc_stderr": 0.03547771004159464, "acc_norm": 0.20610687022900764, "acc_norm_stderr": 0.03547771004159464 }, "harness|hendrycksTest-international_law|5": { "acc": 0.3305785123966942, "acc_stderr": 0.04294340845212094, "acc_norm": 0.3305785123966942, "acc_norm_stderr": 0.04294340845212094 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.3425925925925926, "acc_stderr": 0.04587904741301811, "acc_norm": 0.3425925925925926, "acc_norm_stderr": 0.04587904741301811 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2331288343558282, "acc_stderr": 0.033220157957767414, "acc_norm": 0.2331288343558282, "acc_norm_stderr": 0.033220157957767414 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.26785714285714285, "acc_stderr": 0.04203277291467764, "acc_norm": 0.26785714285714285, "acc_norm_stderr": 0.04203277291467764 }, "harness|hendrycksTest-management|5": { "acc": 0.21359223300970873, "acc_stderr": 0.040580420156460344, "acc_norm": 0.21359223300970873, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.32905982905982906, "acc_stderr": 0.03078232157768816, "acc_norm": 0.32905982905982906, "acc_norm_stderr": 0.03078232157768816 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2822477650063857, "acc_stderr": 0.016095302969878555, "acc_norm": 0.2822477650063857, "acc_norm_stderr": 0.016095302969878555 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.27167630057803466, "acc_stderr": 0.023948512905468365, "acc_norm": 0.27167630057803466, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2346368715083799, "acc_stderr": 0.014173044098303667, "acc_norm": 0.2346368715083799, "acc_norm_stderr": 0.014173044098303667 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.29411764705882354, "acc_stderr": 0.026090162504279053, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.026090162504279053 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2604501607717042, "acc_stderr": 0.024926723224845557, "acc_norm": 0.2604501607717042, "acc_norm_stderr": 0.024926723224845557 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.28703703703703703, "acc_stderr": 0.025171041915309684, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.025171041915309684 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3049645390070922, "acc_stderr": 0.027464708442022128, "acc_norm": 0.3049645390070922, "acc_norm_stderr": 0.027464708442022128 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.27444589308996087, "acc_stderr": 0.011397043163078154, "acc_norm": 0.27444589308996087, "acc_norm_stderr": 0.011397043163078154 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.16911764705882354, "acc_stderr": 0.02277086801011301, "acc_norm": 0.16911764705882354, "acc_norm_stderr": 0.02277086801011301 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.27941176470588236, "acc_stderr": 0.018152871051538816, "acc_norm": 0.27941176470588236, "acc_norm_stderr": 0.018152871051538816 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.3, "acc_stderr": 0.04389311454644287, "acc_norm": 0.3, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3142857142857143, "acc_stderr": 0.029719329422417465, "acc_norm": 0.3142857142857143, "acc_norm_stderr": 0.029719329422417465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.25870646766169153, "acc_stderr": 0.030965903123573037, "acc_norm": 0.25870646766169153, "acc_norm_stderr": 0.030965903123573037 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-virology|5": { "acc": 0.2710843373493976, "acc_stderr": 0.03460579907553026, "acc_norm": 0.2710843373493976, "acc_norm_stderr": 0.03460579907553026 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.2807017543859649, "acc_stderr": 0.034462962170884265, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.20195838433292534, "mc1_stderr": 0.014053957441512359, "mc2": 0.3253448533993895, "mc2_stderr": 0.013862486209403098 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_PygmalionAI__pygmalion-6b
[ "region:us" ]
2023-08-17T23:06:56+00:00
{"pretty_name": "Evaluation run of PygmalionAI/pygmalion-6b", "dataset_summary": "Dataset automatically created during the evaluation run of model [PygmalionAI/pygmalion-6b](https://huggingface.co/PygmalionAI/pygmalion-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__pygmalion-6b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-08T20:04:23.834964](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__pygmalion-6b/blob/main/results_2023-10-08T20-04-23.834964.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26347154250909116,\n \"acc_stderr\": 0.03165492423612406,\n \"acc_norm\": 0.26689039326246145,\n \"acc_norm_stderr\": 0.03165325674877226,\n \"mc1\": 0.20195838433292534,\n \"mc1_stderr\": 0.014053957441512359,\n \"mc2\": 0.3253448533993895,\n \"mc2_stderr\": 0.013862486209403098\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.014131176760131165,\n \"acc_norm\": 0.4052901023890785,\n \"acc_norm_stderr\": 0.014346869060229323\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5053774148575981,\n \"acc_stderr\": 0.004989492828168535,\n \"acc_norm\": 0.6746664011153157,\n \"acc_norm_stderr\": 0.004675418774314239\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.0261998088075619,\n \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.0261998088075619\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.03396116205845333,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.03396116205845333\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.13,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.13,\n 
\"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.030976692998534436,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.030976692998534436\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.23548387096774193,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.03027690994517826,\n \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.03027690994517826\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.2512820512820513,\n \"acc_stderr\": 0.021992016662370526,\n \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.021992016662370526\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295894,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295894\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26788990825688075,\n \"acc_stderr\": 0.018987462257978652,\n \"acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.018987462257978652\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1574074074074074,\n \"acc_stderr\": 0.02483717351824239,\n \"acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.02483717351824239\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.04587904741301811,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.04587904741301811\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32905982905982906,\n \"acc_stderr\": 0.03078232157768816,\n \"acc_norm\": 0.32905982905982906,\n \"acc_norm_stderr\": 0.03078232157768816\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n \"acc_stderr\": 0.016095302969878555,\n 
\"acc_norm\": 0.2822477650063857,\n \"acc_norm_stderr\": 0.016095302969878555\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n \"acc_stderr\": 0.014173044098303667,\n \"acc_norm\": 0.2346368715083799,\n \"acc_norm_stderr\": 0.014173044098303667\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n \"acc_stderr\": 0.024926723224845557,\n \"acc_norm\": 0.2604501607717042,\n \"acc_norm_stderr\": 0.024926723224845557\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27444589308996087,\n \"acc_stderr\": 0.011397043163078154,\n \"acc_norm\": 0.27444589308996087,\n \"acc_norm_stderr\": 0.011397043163078154\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.02277086801011301,\n \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.02277086801011301\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.018152871051538816,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.018152871051538816\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417465,\n \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573037,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573037\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20195838433292534,\n \"mc1_stderr\": 0.014053957441512359,\n \"mc2\": 0.3253448533993895,\n \"mc2_stderr\": 0.013862486209403098\n }\n}\n```", "repo_url": "https://huggingface.co/PygmalionAI/pygmalion-6b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": 
[{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|arc:challenge|25_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T16_08_36.166689", "path": ["**/details_harness|drop|3_2023-09-17T16-08-36.166689.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T16-08-36.166689.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T16_08_36.166689", "path": ["**/details_harness|gsm8k|5_2023-09-17T16-08-36.166689.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T16-08-36.166689.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hellaswag|10_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:25:58.847315.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:25:58.847315.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-23.834964.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-23.834964.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-23.834964.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-23.834964.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:25:58.847315.parquet"]}, 
{"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-08T20-04-23.834964.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T16_08_36.166689", "path": ["**/details_harness|winogrande|5_2023-09-17T16-08-36.166689.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T16-08-36.166689.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_25_58.847315", "path": ["results_2023-07-18T11:25:58.847315.parquet"]}, {"split": "2023_09_17T16_08_36.166689", "path": ["results_2023-09-17T16-08-36.166689.parquet"]}, {"split": "2023_10_08T20_04_23.834964", "path": 
["results_2023-10-08T20-04-23.834964.parquet"]}, {"split": "latest", "path": ["results_2023-10-08T20-04-23.834964.parquet"]}]}]}
2023-10-08T19:05:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PygmalionAI/pygmalion-6b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model PygmalionAI/pygmalion-6b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-08T20:04:23.834964 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
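The cleaned summary above drops the loading snippet that follows "To load the details from a run" in the original card; a minimal sketch is given below. The repository id is an assumption inferred from the details_{org}__{model} naming pattern used elsewhere in this dump, while the "harness_winogrande_5" configuration and the "latest" split come from the config list in this record's metadata.

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the details_{org}__{model} pattern;
# the config and split names are taken from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_PygmalionAI__pygmalion-6b",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```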
[ "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-6b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-08T20:04:23.834964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PygmalionAI/pygmalion-6b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-08T20:04:23.834964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PygmalionAI/pygmalion-6b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PygmalionAI/pygmalion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-08T20:04:23.834964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
3241a5111f11b1e7d22e1dfce547c636b83331f3
# Dataset Card for Evaluation run of victor123/WizardLM-13B-1.0

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/victor123/WizardLM-13B-1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [victor123/WizardLM-13B-1.0](https://huggingface.co/victor123/WizardLM-13B-1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_victor123__WizardLM-13B-1.0",
	"harness_gsm8k_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-03T00:24:56.534385](https://huggingface.co/datasets/open-llm-leaderboard/details_victor123__WizardLM-13B-1.0/blob/main/results_2023-12-03T00-24-56.534385.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_victor123__WizardLM-13B-1.0
[ "region:us" ]
2023-08-17T23:07:05+00:00
{"pretty_name": "Evaluation run of victor123/WizardLM-13B-1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [victor123/WizardLM-13B-1.0](https://huggingface.co/victor123/WizardLM-13B-1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_victor123__WizardLM-13B-1.0\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T00:24:56.534385](https://huggingface.co/datasets/open-llm-leaderboard/details_victor123__WizardLM-13B-1.0/blob/main/results_2023-12-03T00-24-56.534385.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/victor123/WizardLM-13B-1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T22_57_01.663121", "path": ["**/details_harness|drop|3_2023-09-18T22-57-01.663121.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T22-57-01.663121.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T22_57_01.663121", "path": ["**/details_harness|gsm8k|5_2023-09-18T22-57-01.663121.parquet"]}, {"split": "2023_12_03T00_24_56.534385", "path": ["**/details_harness|gsm8k|5_2023-12-03T00-24-56.534385.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T00-24-56.534385.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:18:26.905087.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:18:26.905087.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:18:26.905087.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:18:26.905087.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:18:26.905087.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:18:26.905087.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:18:26.905087.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T22_57_01.663121", "path": ["**/details_harness|winogrande|5_2023-09-18T22-57-01.663121.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T22-57-01.663121.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T16_18_26.905087", "path": ["results_2023-07-18T16:18:26.905087.parquet"]}, {"split": "2023_09_18T22_57_01.663121", "path": ["results_2023-09-18T22-57-01.663121.parquet"]}, {"split": "2023_12_03T00_24_56.534385", "path": ["results_2023-12-03T00-24-56.534385.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T00-24-56.534385.parquet"]}]}]}
2023-12-03T00:25:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of victor123/WizardLM-13B-1.0 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model victor123/WizardLM-13B-1.0 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-03T00:24:56.534385 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
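As with the record above, this stripped summary omits the loading snippet; a minimal sketch follows, using the repository id and the "harness_gsm8k_5" configuration that appear in this record's metadata (the "latest" split is defined there as well).

```python
from datasets import load_dataset

# Repository id and config name are taken from this record's metadata;
# "latest" always points at the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_victor123__WizardLM-13B-1.0",
    "harness_gsm8k_5",
    split="latest",
)
print(data)
```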
[ "# Dataset Card for Evaluation run of victor123/WizardLM-13B-1.0", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model victor123/WizardLM-13B-1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T00:24:56.534385(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of victor123/WizardLM-13B-1.0", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model victor123/WizardLM-13B-1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T00:24:56.534385(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of victor123/WizardLM-13B-1.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model victor123/WizardLM-13B-1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T00:24:56.534385(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
86b09b6b3e8b03b42cf3088bd2b65dc97447f066
# Dataset Card for Evaluation run of beomi/llama-2-ko-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/beomi/llama-2-ko-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [beomi/llama-2-ko-7b](https://huggingface.co/beomi/llama-2-ko-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beomi__llama-2-ko-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T14:17:57.880003](https://huggingface.co/datasets/open-llm-leaderboard/details_beomi__llama-2-ko-7b/blob/main/results_2023-09-17T14-17-57.880003.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001153523489932886,
        "em_stderr": 0.00034761798968571043,
        "f1": 0.04101300335570507,
        "f1_stderr": 0.0009468163407656627,
        "acc": 0.37055050554311253,
        "acc_stderr": 0.008214439814114797
    },
    "harness|drop|3": {
        "em": 0.001153523489932886,
        "em_stderr": 0.00034761798968571043,
        "f1": 0.04101300335570507,
        "f1_stderr": 0.0009468163407656627
    },
    "harness|gsm8k|5": {
        "acc": 0.019711902956785442,
        "acc_stderr": 0.0038289829787357134
    },
    "harness|winogrande|5": {
        "acc": 0.7213891081294396,
        "acc_stderr": 0.01259989664949388
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
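The loading snippet above pulls a single per-task split; the aggregated numbers live in the separate "results" configuration, whose "latest" split always mirrors the most recent run. Below is a minimal sketch of listing the available configurations and reading those aggregates, assuming only the config and split names documented in this card's metadata (`get_dataset_config_names` is the standard `datasets` helper for enumerating configs, not something specific to this repository):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_beomi__llama-2-ko-7b"

# Enumerate the available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.", configs[:5])

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```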
open-llm-leaderboard/details_beomi__llama-2-ko-7b
[ "region:us" ]
2023-08-17T23:07:15+00:00
{"pretty_name": "Evaluation run of beomi/llama-2-ko-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [beomi/llama-2-ko-7b](https://huggingface.co/beomi/llama-2-ko-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beomi__llama-2-ko-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T14:17:57.880003](https://huggingface.co/datasets/open-llm-leaderboard/details_beomi__llama-2-ko-7b/blob/main/results_2023-09-17T14-17-57.880003.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571043,\n \"f1\": 0.04101300335570507,\n \"f1_stderr\": 0.0009468163407656627,\n \"acc\": 0.37055050554311253,\n \"acc_stderr\": 0.008214439814114797\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571043,\n \"f1\": 0.04101300335570507,\n \"f1_stderr\": 0.0009468163407656627\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \"acc_stderr\": 0.0038289829787357134\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7213891081294396,\n \"acc_stderr\": 0.01259989664949388\n }\n}\n```", "repo_url": "https://huggingface.co/beomi/llama-2-ko-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T14_17_57.880003", "path": ["**/details_harness|drop|3_2023-09-17T14-17-57.880003.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T14-17-57.880003.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T14_17_57.880003", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-17-57.880003.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-17-57.880003.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:07:33.480523.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:07:33.480523.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:07:33.480523.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:07:33.480523.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:07:33.480523.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T14_17_57.880003", "path": ["**/details_harness|winogrande|5_2023-09-17T14-17-57.880003.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T14-17-57.880003.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_07_33.480523", "path": ["results_2023-07-24T11:07:33.480523.parquet"]}, {"split": "2023_09_17T14_17_57.880003", "path": ["results_2023-09-17T14-17-57.880003.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T14-17-57.880003.parquet"]}]}]}
2023-09-17T13:18:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of beomi/llama-2-ko-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model beomi/llama-2-ko-7b on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T14:17:57.880003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of beomi/llama-2-ko-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/llama-2-ko-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T14:17:57.880003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of beomi/llama-2-ko-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/llama-2-ko-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T14:17:57.880003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beomi/llama-2-ko-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/llama-2-ko-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T14:17:57.880003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
826d713313fbd57b1c9137c63093f3b724ed5e0b
# Dataset Card for Evaluation run of beomi/KoAlpaca-Polyglot-5.8B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/beomi/KoAlpaca-Polyglot-5.8B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [beomi/KoAlpaca-Polyglot-5.8B](https://huggingface.co/beomi/KoAlpaca-Polyglot-5.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beomi__KoAlpaca-Polyglot-5.8B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-22T22:10:39.400321](https://huggingface.co/datasets/open-llm-leaderboard/details_beomi__KoAlpaca-Polyglot-5.8B/blob/main/results_2023-09-22T22-10-39.400321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.01541526845637584,
        "em_stderr": 0.0012616582904353766,
        "f1": 0.054131711409395974,
        "f1_stderr": 0.0017182561984205931,
        "acc": 0.24544616266538535,
        "acc_stderr": 0.007403949973545061
    },
    "harness|drop|3": {
        "em": 0.01541526845637584,
        "em_stderr": 0.0012616582904353766,
        "f1": 0.054131711409395974,
        "f1_stderr": 0.0017182561984205931
    },
    "harness|gsm8k|5": {
        "acc": 0.000758150113722517,
        "acc_stderr": 0.0007581501137225404
    },
    "harness|winogrande|5": {
        "acc": 0.49013417521704816,
        "acc_stderr": 0.014049749833367582
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
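In addition to "latest", each per-task configuration keeps one split per evaluation timestamp, so earlier runs remain loadable. A minimal sketch of comparing two runs of the same task is shown below; the split names are copied from this card's metadata, and `to_pandas` is the standard `datasets` conversion method rather than anything specific to this repository:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_beomi__KoAlpaca-Polyglot-5.8B"
task = "harness_gsm8k_5"

# Each run is stored under a split named after its timestamp;
# "latest" is an alias for the most recent run (2023_09_22T22_10_39.400321 here).
run_first = load_dataset(repo, task, split="2023_09_17T10_40_00.706474")
run_latest = load_dataset(repo, task, split="latest")

# Convert to pandas to inspect the per-example details side by side.
df_first, df_latest = run_first.to_pandas(), run_latest.to_pandas()
print(len(df_first), "examples in the first run,", len(df_latest), "in the latest run")
```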
open-llm-leaderboard/details_beomi__KoAlpaca-Polyglot-5.8B
[ "region:us" ]
2023-08-17T23:07:24+00:00
{"pretty_name": "Evaluation run of beomi/KoAlpaca-Polyglot-5.8B", "dataset_summary": "Dataset automatically created during the evaluation run of model [beomi/KoAlpaca-Polyglot-5.8B](https://huggingface.co/beomi/KoAlpaca-Polyglot-5.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beomi__KoAlpaca-Polyglot-5.8B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T22:10:39.400321](https://huggingface.co/datasets/open-llm-leaderboard/details_beomi__KoAlpaca-Polyglot-5.8B/blob/main/results_2023-09-22T22-10-39.400321.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01541526845637584,\n \"em_stderr\": 0.0012616582904353766,\n \"f1\": 0.054131711409395974,\n \"f1_stderr\": 0.0017182561984205931,\n \"acc\": 0.24544616266538535,\n \"acc_stderr\": 0.007403949973545061\n },\n \"harness|drop|3\": {\n \"em\": 0.01541526845637584,\n \"em_stderr\": 0.0012616582904353766,\n \"f1\": 0.054131711409395974,\n \"f1_stderr\": 0.0017182561984205931\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225404\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.49013417521704816,\n \"acc_stderr\": 0.014049749833367582\n }\n}\n```", "repo_url": "https://huggingface.co/beomi/KoAlpaca-Polyglot-5.8B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T10_40_00.706474", "path": ["**/details_harness|drop|3_2023-09-17T10-40-00.706474.parquet"]}, {"split": "2023_09_22T22_10_39.400321", "path": ["**/details_harness|drop|3_2023-09-22T22-10-39.400321.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T22-10-39.400321.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T10_40_00.706474", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-40-00.706474.parquet"]}, {"split": "2023_09_22T22_10_39.400321", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-10-39.400321.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-10-39.400321.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_07_18T12_52_43.613378", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:52:43.613378.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:52:43.613378.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:52:43.613378.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:52:43.613378.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:52:43.613378.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:52:43.613378.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T10_40_00.706474", "path": ["**/details_harness|winogrande|5_2023-09-17T10-40-00.706474.parquet"]}, {"split": "2023_09_22T22_10_39.400321", "path": ["**/details_harness|winogrande|5_2023-09-22T22-10-39.400321.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T22-10-39.400321.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T12_52_43.613378", "path": ["results_2023-07-18T12:52:43.613378.parquet"]}, {"split": "2023_09_17T10_40_00.706474", "path": ["results_2023-09-17T10-40-00.706474.parquet"]}, {"split": "2023_09_22T22_10_39.400321", "path": ["results_2023-09-22T22-10-39.400321.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T22-10-39.400321.parquet"]}]}]}
2023-09-22T21:10:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of beomi/KoAlpaca-Polyglot-5.8B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model beomi/KoAlpaca-Polyglot-5.8B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T22:10:39.400321 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of beomi/KoAlpaca-Polyglot-5.8B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/KoAlpaca-Polyglot-5.8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T22:10:39.400321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of beomi/KoAlpaca-Polyglot-5.8B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/KoAlpaca-Polyglot-5.8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T22:10:39.400321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beomi/KoAlpaca-Polyglot-5.8B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model beomi/KoAlpaca-Polyglot-5.8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T22:10:39.400321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
dfe6cb7418e78f9aca408a26c24fad2fe01f7fca
# Dataset Card for Evaluation run of keyfan/vicuna-chinese-replication-v1.1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/keyfan/vicuna-chinese-replication-v1.1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [keyfan/vicuna-chinese-replication-v1.1](https://huggingface.co/keyfan/vicuna-chinese-replication-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_keyfan__vicuna-chinese-replication-v1.1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-20T16:29:17.450088](https://huggingface.co/datasets/open-llm-leaderboard/details_keyfan__vicuna-chinese-replication-v1.1/blob/main/results_2023-09-20T16-29-17.450088.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.19274328859060402,
        "em_stderr": 0.004039569791455342,
        "f1": 0.2668655620805379,
        "f1_stderr": 0.004116773539445767,
        "acc": 0.3844009566932927,
        "acc_stderr": 0.0106207870984688
    },
    "harness|drop|3": {
        "em": 0.19274328859060402,
        "em_stderr": 0.004039569791455342,
        "f1": 0.2668655620805379,
        "f1_stderr": 0.004116773539445767
    },
    "harness|gsm8k|5": {
        "acc": 0.09476876421531463,
        "acc_stderr": 0.008067791560015412
    },
    "harness|winogrande|5": {
        "acc": 0.6740331491712708,
        "acc_stderr": 0.013173782636922189
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
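Besides the per-sample details shown above, the aggregated scores live in the "results" configuration. A minimal sketch of how one might read them, assuming the "results" config loads like any other config of this dataset (the config and split names are taken from the metadata of this record; the exact row schema is not verified here):

```python
from datasets import load_dataset

# The "results" config stores one split per run timestamp plus a "latest"
# split; "latest" points at the most recent evaluation of this model.
results = load_dataset(
    "open-llm-leaderboard/details_keyfan__vicuna-chinese-replication-v1.1",
    "results",
    split="latest",
)

# Inspect the aggregated metrics row(s) for the latest run.
print(results[0])
```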
open-llm-leaderboard/details_keyfan__vicuna-chinese-replication-v1.1
[ "region:us" ]
2023-08-17T23:07:33+00:00
{"pretty_name": "Evaluation run of keyfan/vicuna-chinese-replication-v1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [keyfan/vicuna-chinese-replication-v1.1](https://huggingface.co/keyfan/vicuna-chinese-replication-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_keyfan__vicuna-chinese-replication-v1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-20T16:29:17.450088](https://huggingface.co/datasets/open-llm-leaderboard/details_keyfan__vicuna-chinese-replication-v1.1/blob/main/results_2023-09-20T16-29-17.450088.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19274328859060402,\n \"em_stderr\": 0.004039569791455342,\n \"f1\": 0.2668655620805379,\n \"f1_stderr\": 0.004116773539445767,\n \"acc\": 0.3844009566932927,\n \"acc_stderr\": 0.0106207870984688\n },\n \"harness|drop|3\": {\n \"em\": 0.19274328859060402,\n \"em_stderr\": 0.004039569791455342,\n \"f1\": 0.2668655620805379,\n \"f1_stderr\": 0.004116773539445767\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09476876421531463,\n \"acc_stderr\": 0.008067791560015412\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6740331491712708,\n \"acc_stderr\": 0.013173782636922189\n }\n}\n```", "repo_url": "https://huggingface.co/keyfan/vicuna-chinese-replication-v1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_20T16_29_17.450088", "path": ["**/details_harness|drop|3_2023-09-20T16-29-17.450088.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-20T16-29-17.450088.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_20T16_29_17.450088", "path": ["**/details_harness|gsm8k|5_2023-09-20T16-29-17.450088.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-20T16-29-17.450088.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:34:51.648519.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:34:51.648519.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:34:51.648519.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:34:51.648519.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:34:51.648519.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:34:51.648519.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_20T16_29_17.450088", "path": ["**/details_harness|winogrande|5_2023-09-20T16-29-17.450088.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-20T16-29-17.450088.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T15_34_51.648519", "path": ["results_2023-07-24T15:34:51.648519.parquet"]}, {"split": "2023_09_20T16_29_17.450088", "path": ["results_2023-09-20T16-29-17.450088.parquet"]}, {"split": "latest", "path": ["results_2023-09-20T16-29-17.450088.parquet"]}]}]}
2023-09-20T15:29:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of keyfan/vicuna-chinese-replication-v1.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model keyfan/vicuna-chinese-replication-v1.1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-20T16:29:17.450088 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
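For reference, a minimal loading sketch matching the "do the following" instruction in the card above might look like the snippet below. The repository id is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming convention, and `harness_winogrande_5` is just one of the listed configurations.

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard's
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_keyfan__vicuna-chinese-replication-v1.1",
    "harness_winogrande_5",
    split="train",
)
```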
[ "# Dataset Card for Evaluation run of keyfan/vicuna-chinese-replication-v1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model keyfan/vicuna-chinese-replication-v1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-20T16:29:17.450088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of keyfan/vicuna-chinese-replication-v1.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model keyfan/vicuna-chinese-replication-v1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-20T16:29:17.450088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of keyfan/vicuna-chinese-replication-v1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model keyfan/vicuna-chinese-replication-v1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-20T16:29:17.450088(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
01e2fa822e018e0312d16a8fc5d97f7cd9777ce4
# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/garage-bAInd/Camel-Platypus2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [garage-bAInd/Camel-Platypus2-13B](https://huggingface.co/garage-bAInd/Camel-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-13T04:35:13.977731](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B/blob/main/results_2023-10-13T04-35-13.977731.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.3248741610738255, "em_stderr": 0.004796115152921962, "f1": 0.38906250000000175, "f1_stderr": 0.004663274154133875, "acc": 0.37725358176562207, "acc_stderr": 0.006433257710580032 }, "harness|drop|3": { "em": 0.3248741610738255, "em_stderr": 0.004796115152921962, "f1": 0.38906250000000175, "f1_stderr": 0.004663274154133875 }, "harness|gsm8k|5": { "acc": 0.000758150113722517, "acc_stderr": 0.0007581501137225365 }, "harness|winogrande|5": { "acc": 0.7537490134175217, "acc_stderr": 0.012108365307437528 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
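Beyond the per-task details loaded in the snippet above, the aggregated per-run metrics can be pulled from the "results" configuration; a small sketch using the config and split names listed in this card's metadata might look like this:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B",
    "results",
    split="latest",
)
print(results[0])
```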
open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B
[ "region:us" ]
2023-08-17T23:07:42+00:00
{"pretty_name": "Evaluation run of garage-bAInd/Camel-Platypus2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [garage-bAInd/Camel-Platypus2-13B](https://huggingface.co/garage-bAInd/Camel-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T04:35:13.977731](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B/blob/main/results_2023-10-13T04-35-13.977731.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3248741610738255,\n \"em_stderr\": 0.004796115152921962,\n \"f1\": 0.38906250000000175,\n \"f1_stderr\": 0.004663274154133875,\n \"acc\": 0.37725358176562207,\n \"acc_stderr\": 0.006433257710580032\n },\n \"harness|drop|3\": {\n \"em\": 0.3248741610738255,\n \"em_stderr\": 0.004796115152921962,\n \"f1\": 0.38906250000000175,\n \"f1_stderr\": 0.004663274154133875\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225365\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437528\n }\n}\n```", "repo_url": "https://huggingface.co/garage-bAInd/Camel-Platypus2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|arc:challenge|25_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T04_35_13.977731", "path": ["**/details_harness|drop|3_2023-10-13T04-35-13.977731.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T04-35-13.977731.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T04_35_13.977731", "path": ["**/details_harness|gsm8k|5_2023-10-13T04-35-13.977731.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T04-35-13.977731.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hellaswag|10_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T16:10:57.360881.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T16:10:57.360881.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T04_35_13.977731", "path": ["**/details_harness|winogrande|5_2023-10-13T04-35-13.977731.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T04-35-13.977731.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T16_10_57.360881", "path": ["results_2023-08-09T16:10:57.360881.parquet"]}, {"split": "2023_10_13T04_35_13.977731", "path": ["results_2023-10-13T04-35-13.977731.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T04-35-13.977731.parquet"]}]}]}
2023-10-13T03:35:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model garage-bAInd/Camel-Platypus2-13B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (a sketch is given just after this card): ## Latest results These are the latest results from run 2023-10-13T04:35:13.977731 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
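The simplified card above drops the loading snippet that normally follows "do the following:". A minimal sketch is given below; note that the repository name `open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B` is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming pattern, while `harness_winogrande_5` is one of the configurations declared in the metadata above.

```python
from datasets import load_dataset

# Assumed repository name, inferred from the leaderboard's usual
# "details_<org>__<model>" pattern (the card above omits the snippet).
data = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B",
    "harness_winogrande_5",  # per-task configuration listed in the record's metadata
    split="train",           # "train" always points to the latest results
)
print(data)
```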
[ "# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Camel-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T04:35:13.977731(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Camel-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T04:35:13.977731(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Camel-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T04:35:13.977731(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7baca97b8eed4c852da464970595e6be15a941c4
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B-instruct

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/garage-bAInd/Platypus2-70B-instruct
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-70B-instruct](https://huggingface.co/garage-bAInd/Platypus2-70B-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct_public",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-11-09T00:36:31.182871](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct_public/blob/main/results_2023-11-09T00-36-31.182871.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.4080327181208054,
        "em_stderr": 0.0050331050783076585,
        "f1": 0.5241086409395995,
        "f1_stderr": 0.004559323839567607,
        "acc": 0.616380530322115,
        "acc_stderr": 0.012075906712216984
    },
    "harness|drop|3": {
        "em": 0.4080327181208054,
        "em_stderr": 0.0050331050783076585,
        "f1": 0.5241086409395995,
        "f1_stderr": 0.004559323839567607
    },
    "harness|gsm8k|5": {
        "acc": 0.40561031084154664,
        "acc_stderr": 0.013524848894462104
    },
    "harness|winogrande|5": {
        "acc": 0.8271507498026835,
        "acc_stderr": 0.010626964529971862
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
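Beyond the `split="train"` example embedded in the card, each configuration also exposes a split named after the run timestamp and a `latest` alias (see the configs declared in this record's metadata field below). A minimal sketch of loading the run-specific split, assuming the declared split name resolves as-is:

```python
from datasets import load_dataset

# Sketch: load the timestamped split for a single run instead of the
# "latest" alias; the split and config names come from the configs
# declared in this record's metadata field.
details = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct_public",
    "harness_gsm8k_5",
    split="2023_11_09T00_36_31.182871",
)
print(details)
```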
open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct
[ "region:us" ]
2023-08-17T23:07:51+00:00
{"pretty_name": "Evaluation run of garage-bAInd/Platypus2-70B-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-70B-instruct](https://huggingface.co/garage-bAInd/Platypus2-70B-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-09T00:36:31.182871](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct_public/blob/main/results_2023-11-09T00-36-31.182871.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4080327181208054,\n \"em_stderr\": 0.0050331050783076585,\n \"f1\": 0.5241086409395995,\n \"f1_stderr\": 0.004559323839567607,\n \"acc\": 0.616380530322115,\n \"acc_stderr\": 0.012075906712216984\n },\n \"harness|drop|3\": {\n \"em\": 0.4080327181208054,\n \"em_stderr\": 0.0050331050783076585,\n \"f1\": 0.5241086409395995,\n \"f1_stderr\": 0.004559323839567607\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40561031084154664,\n \"acc_stderr\": 0.013524848894462104\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971862\n }\n}\n```", "repo_url": "https://huggingface.co/garage-bAInd/Platypus2-70B-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_09T00_36_31.182871", "path": ["**/details_harness|drop|3_2023-11-09T00-36-31.182871.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-09T00-36-31.182871.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_09T00_36_31.182871", "path": ["**/details_harness|gsm8k|5_2023-11-09T00-36-31.182871.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-09T00-36-31.182871.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_09T00_36_31.182871", "path": ["**/details_harness|winogrande|5_2023-11-09T00-36-31.182871.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-09T00-36-31.182871.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_09T00_36_31.182871", "path": ["results_2023-11-09T00-36-31.182871.parquet"]}, {"split": "latest", "path": ["results_2023-11-09T00-36-31.182871.parquet"]}]}]}
2023-12-01T14:53:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B-instruct ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B-instruct on the Open LLM Leaderboard. The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (a sketch is given just after this card): ## Latest results These are the latest results from run 2023-11-09T00:36:31.182871 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
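As a sketch of the loading step referenced above: the per-task configurations can be read directly, and the "results" configuration holds the aggregated metrics. The configuration and split names below are taken from this record's metadata; the snippet itself is illustrative and not part of the original card.

```python
from datasets import load_dataset

# Per-task details: "latest" aliases the most recent evaluation run.
drop_details = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct_public",
    "harness_drop_3",
    split="latest",
)

# Aggregated metrics for the run, stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct_public",
    "results",
    split="latest",
)
print(results[0])
```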
[ "# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-09T00:36:31.182871(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-09T00:36:31.182871(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-09T00:36:31.182871(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9f7d729af148784a485353c5bc7bd83336b1a9f6
# Dataset Card for Evaluation run of garage-bAInd/Dolphin-Platypus2-70B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/garage-bAInd/Dolphin-Platypus2-70B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [garage-bAInd/Dolphin-Platypus2-70B](https://huggingface.co/garage-bAInd/Dolphin-Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Dolphin-Platypus2-70B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-08-10T02:32:56.587713](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Dolphin-Platypus2-70B/blob/main/results_2023-08-10T02%3A32%3A56.587713.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6895249670105975, "acc_stderr": 0.031417385723151066, "acc_norm": 0.6936032221534247, "acc_norm_stderr": 0.031387123187245417, "mc1": 0.397796817625459, "mc1_stderr": 0.017133934248559635, "mc2": 0.566489803511904, "mc2_stderr": 0.014977450728482283 }, "harness|arc:challenge|25": { "acc": 0.6629692832764505, "acc_stderr": 0.01381347665290228, "acc_norm": 0.7039249146757679, "acc_norm_stderr": 0.013340916085246261 }, "harness|hellaswag|10": { "acc": 0.6672973511252739, "acc_stderr": 0.0047021810422158885, "acc_norm": 0.8669587731527584, "acc_norm_stderr": 0.0033892519914384936 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7631578947368421, "acc_stderr": 0.03459777606810535, "acc_norm": 0.7631578947368421, "acc_norm_stderr": 0.03459777606810535 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8263888888888888, "acc_stderr": 0.03167473383795718, "acc_norm": 0.8263888888888888, "acc_norm_stderr": 0.03167473383795718 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 
0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.046550104113196177, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.046550104113196177 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909281, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6382978723404256, "acc_stderr": 0.0314108219759624, "acc_norm": 0.6382978723404256, "acc_norm_stderr": 0.0314108219759624 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4576719576719577, "acc_stderr": 0.025658868862058325, "acc_norm": 0.4576719576719577, "acc_norm_stderr": 0.025658868862058325 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5158730158730159, "acc_stderr": 0.044698818540726076, "acc_norm": 0.5158730158730159, "acc_norm_stderr": 0.044698818540726076 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8, "acc_stderr": 0.02275520495954294, "acc_norm": 0.8, "acc_norm_stderr": 0.02275520495954294 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5320197044334976, "acc_stderr": 0.035107665979592154, "acc_norm": 0.5320197044334976, "acc_norm_stderr": 0.035107665979592154 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8545454545454545, "acc_stderr": 0.027530196355066573, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066573 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8535353535353535, "acc_stderr": 0.025190921114603918, "acc_norm": 0.8535353535353535, "acc_norm_stderr": 0.025190921114603918 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7051282051282052, "acc_stderr": 0.023119362758232294, "acc_norm": 0.7051282051282052, "acc_norm_stderr": 0.023119362758232294 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114986, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114986 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7605042016806722, 
"acc_stderr": 0.027722065493361262, "acc_norm": 0.7605042016806722, "acc_norm_stderr": 0.027722065493361262 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.48344370860927155, "acc_stderr": 0.040802441856289715, "acc_norm": 0.48344370860927155, "acc_norm_stderr": 0.040802441856289715 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8844036697247707, "acc_stderr": 0.01370874953417264, "acc_norm": 0.8844036697247707, "acc_norm_stderr": 0.01370874953417264 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5879629629629629, "acc_stderr": 0.03356787758160831, "acc_norm": 0.5879629629629629, "acc_norm_stderr": 0.03356787758160831 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9068627450980392, "acc_stderr": 0.020397853969427, "acc_norm": 0.9068627450980392, "acc_norm_stderr": 0.020397853969427 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.890295358649789, "acc_stderr": 0.020343400734868837, "acc_norm": 0.890295358649789, "acc_norm_stderr": 0.020343400734868837 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7802690582959642, "acc_stderr": 0.027790177064383595, "acc_norm": 0.7802690582959642, "acc_norm_stderr": 0.027790177064383595 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.816793893129771, "acc_stderr": 0.03392770926494733, "acc_norm": 0.816793893129771, "acc_norm_stderr": 0.03392770926494733 }, "harness|hendrycksTest-international_law|5": { "acc": 0.859504132231405, "acc_stderr": 0.03172233426002158, "acc_norm": 0.859504132231405, "acc_norm_stderr": 0.03172233426002158 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8240740740740741, "acc_stderr": 0.036809181416738807, "acc_norm": 0.8240740740740741, "acc_norm_stderr": 0.036809181416738807 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5803571428571429, "acc_stderr": 0.04684099321077106, "acc_norm": 0.5803571428571429, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.037601780060266196, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.037601780060266196 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9145299145299145, "acc_stderr": 0.01831589168562585, "acc_norm": 0.9145299145299145, "acc_norm_stderr": 0.01831589168562585 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8633461047254151, "acc_stderr": 0.012282876868629234, "acc_norm": 0.8633461047254151, "acc_norm_stderr": 0.012282876868629234 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044287, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6312849162011173, "acc_stderr": 0.01613575901503012, "acc_norm": 0.6312849162011173, "acc_norm_stderr": 0.01613575901503012 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.024848018263875195, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.024848018263875195 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7813504823151125, "acc_stderr": 0.023475581417861113, "acc_norm": 0.7813504823151125, "acc_norm_stderr": 0.023475581417861113 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.845679012345679, "acc_stderr": 0.020100830999850994, "acc_norm": 0.845679012345679, "acc_norm_stderr": 0.020100830999850994 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5567375886524822, "acc_stderr": 0.029634838473766006, "acc_norm": 0.5567375886524822, "acc_norm_stderr": 0.029634838473766006 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5691003911342895, "acc_stderr": 0.012647695889547226, "acc_norm": 0.5691003911342895, "acc_norm_stderr": 0.012647695889547226 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7169117647058824, "acc_stderr": 0.02736586113151381, "acc_norm": 0.7169117647058824, "acc_norm_stderr": 0.02736586113151381 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7450980392156863, "acc_stderr": 0.01763082737514838, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.01763082737514838 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.04309118709946458, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.04309118709946458 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7714285714285715, "acc_stderr": 0.02688214492230774, "acc_norm": 0.7714285714285715, "acc_norm_stderr": 0.02688214492230774 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8805970149253731, "acc_stderr": 0.02292879327721974, "acc_norm": 0.8805970149253731, "acc_norm_stderr": 0.02292879327721974 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.02753912288906145, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.02753912288906145 }, "harness|truthfulqa:mc|0": { "mc1": 0.397796817625459, "mc1_stderr": 0.017133934248559635, "mc2": 0.566489803511904, "mc2_stderr": 0.014977450728482283 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_garage-bAInd__Dolphin-Platypus2-70B
[ "region:us" ]
2023-08-17T23:07:59+00:00
{"pretty_name": "Evaluation run of garage-bAInd/Dolphin-Platypus2-70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [garage-bAInd/Dolphin-Platypus2-70B](https://huggingface.co/garage-bAInd/Dolphin-Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Dolphin-Platypus2-70B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-10T02:32:56.587713](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Dolphin-Platypus2-70B/blob/main/results_2023-08-10T02%3A32%3A56.587713.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6895249670105975,\n \"acc_stderr\": 0.031417385723151066,\n \"acc_norm\": 0.6936032221534247,\n \"acc_norm_stderr\": 0.031387123187245417,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.566489803511904,\n \"mc2_stderr\": 0.014977450728482283\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.01381347665290228,\n \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.013340916085246261\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6672973511252739,\n \"acc_stderr\": 0.0047021810422158885,\n \"acc_norm\": 0.8669587731527584,\n \"acc_norm_stderr\": 0.0033892519914384936\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.0314108219759624,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.0314108219759624\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4576719576719577,\n \"acc_stderr\": 0.025658868862058325,\n \"acc_norm\": 0.4576719576719577,\n \"acc_norm_stderr\": 0.025658868862058325\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.7051282051282052,\n \"acc_stderr\": 0.023119362758232294,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.023119362758232294\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361262,\n \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361262\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289715,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289715\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868837,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868837\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n \"acc_stderr\": 0.012282876868629234,\n \"acc_norm\": 
0.8633461047254151,\n \"acc_norm_stderr\": 0.012282876868629234\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6312849162011173,\n \"acc_stderr\": 0.01613575901503012,\n \"acc_norm\": 0.6312849162011173,\n \"acc_norm_stderr\": 0.01613575901503012\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.023475581417861113,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.023475581417861113\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5691003911342895,\n \"acc_stderr\": 0.012647695889547226,\n \"acc_norm\": 0.5691003911342895,\n \"acc_norm_stderr\": 0.012647695889547226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.01763082737514838,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.01763082737514838\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.566489803511904,\n \"mc2_stderr\": 0.014977450728482283\n }\n}\n```", "repo_url": "https://huggingface.co/garage-bAInd/Dolphin-Platypus2-70B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": 
[{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|arc:challenge|25_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hellaswag|10_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:32:56.587713.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:32:56.587713.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:32:56.587713.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T02:32:56.587713.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T02:32:56.587713.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T02_32_56.587713", "path": ["results_2023-08-10T02:32:56.587713.parquet"]}, {"split": "latest", "path": ["results_2023-08-10T02:32:56.587713.parquet"]}]}]}
2023-08-27T11:27:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of garage-bAInd/Dolphin-Platypus2-70B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model garage-bAInd/Dolphin-Platypus2-70B on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-08-10T02:32:56.587713 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
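The flattened card above ends its "To load the details from a run, you can for instance do the following:" sentence without the snippet that normally follows it. A minimal sketch of that call is given below; the repository id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention for this model, and the config name is taken from the metadata block for this record.

```python
from datasets import load_dataset

# Repo id is an assumption, inferred from the leaderboard's
# "details_<org>__<model>" naming convention for this model;
# "harness_hellaswag_10" is one of the configs listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Dolphin-Platypus2-70B",
    "harness_hellaswag_10",
    split="train",
)
```

Per the card text, the "train" split always points at the latest results; the "latest" split or the timestamped split named after a run can be requested instead to pin a specific evaluation.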
[ "# Dataset Card for Evaluation run of garage-bAInd/Dolphin-Platypus2-70B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Dolphin-Platypus2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-10T02:32:56.587713 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of garage-bAInd/Dolphin-Platypus2-70B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Dolphin-Platypus2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-08-10T02:32:56.587713 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of garage-bAInd/Dolphin-Platypus2-70B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Dolphin-Platypus2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-10T02:32:56.587713 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f4c38aee4daadaa606c7c65f77cff544d7f8af36
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-13B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/garage-bAInd/Platypus2-13B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-13B](https://huggingface.co/garage-bAInd/Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Platypus2-13B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T01:36:13.109840](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-13B/blob/main/results_2023-09-18T01-36-13.109840.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0065016778523489934,
        "em_stderr": 0.0008230684297224003,
        "f1": 0.06950713087248322,
        "f1_stderr": 0.001573785110075933,
        "acc": 0.4196265138319013,
        "acc_stderr": 0.009450791969417059
    },
    "harness|drop|3": {
        "em": 0.0065016778523489934,
        "em_stderr": 0.0008230684297224003,
        "f1": 0.06950713087248322,
        "f1_stderr": 0.001573785110075933
    },
    "harness|gsm8k|5": {
        "acc": 0.07050796057619409,
        "acc_stderr": 0.007051543813983609
    },
    "harness|winogrande|5": {
        "acc": 0.7687450670876085,
        "acc_stderr": 0.01185004012485051
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_garage-bAInd__Platypus2-13B
[ "region:us" ]
2023-08-17T23:08:08+00:00
{"pretty_name": "Evaluation run of garage-bAInd/Platypus2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-13B](https://huggingface.co/garage-bAInd/Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Platypus2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T01:36:13.109840](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-13B/blob/main/results_2023-09-18T01-36-13.109840.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0065016778523489934,\n \"em_stderr\": 0.0008230684297224003,\n \"f1\": 0.06950713087248322,\n \"f1_stderr\": 0.001573785110075933,\n \"acc\": 0.4196265138319013,\n \"acc_stderr\": 0.009450791969417059\n },\n \"harness|drop|3\": {\n \"em\": 0.0065016778523489934,\n \"em_stderr\": 0.0008230684297224003,\n \"f1\": 0.06950713087248322,\n \"f1_stderr\": 0.001573785110075933\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07050796057619409,\n \"acc_stderr\": 0.007051543813983609\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n }\n}\n```", "repo_url": "https://huggingface.co/garage-bAInd/Platypus2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T01_36_13.109840", "path": ["**/details_harness|drop|3_2023-09-18T01-36-13.109840.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T01-36-13.109840.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T01_36_13.109840", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-36-13.109840.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-36-13.109840.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:47:08.071954.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:47:08.071954.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:47:08.071954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:47:08.071954.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:47:08.071954.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T01_36_13.109840", "path": ["**/details_harness|winogrande|5_2023-09-18T01-36-13.109840.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T01-36-13.109840.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T15_47_08.071954", "path": ["results_2023-08-09T15:47:08.071954.parquet"]}, {"split": "2023_09_18T01_36_13.109840", "path": ["results_2023-09-18T01-36-13.109840.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T01-36-13.109840.parquet"]}]}]}
2023-09-18T00:36:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model garage-bAInd/Platypus2-13B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-18T01:36:13.109840 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
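The load snippet this summary refers to is stripped in this flattened copy of the card, so a minimal sketch is given here. The repository id is an assumption following the leaderboard's usual `details_<org>__<model>` naming; the `harness_winogrande_5` configuration and the `latest` split are taken from the config list in the metadata above.

```python
from datasets import load_dataset

# Assumed repository id (not stated in this stripped copy), following the
# open-llm-leaderboard naming pattern for per-model detail datasets.
repo_id = "open-llm-leaderboard/details_garage-bAInd__Platypus2-13B"

# Load one evaluated task; the "latest" split points to the most recent run.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data)
```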
[ "# Dataset Card for Evaluation run of garage-bAInd/Platypus2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T01:36:13.109840(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of garage-bAInd/Platypus2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T01:36:13.109840(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of garage-bAInd/Platypus2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T01:36:13.109840(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
9c80558e69f593a3a0eab3e28de49d741a84fd4f
# Dataset Card for Evaluation run of garage-bAInd/Stable-Platypus2-13B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/garage-bAInd/Stable-Platypus2-13B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [garage-bAInd/Stable-Platypus2-13B](https://huggingface.co/garage-bAInd/Stable-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T23:47:31.962394](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B/blob/main/results_2023-09-17T23-47-31.962394.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.37531459731543626,
        "em_stderr": 0.004958702554959804,
        "f1": 0.45221476510067204,
        "f1_stderr": 0.004729347386559949,
        "acc": 0.39347033490847444,
        "acc_stderr": 0.00776582600946219
    },
    "harness|drop|3": {
        "em": 0.37531459731543626,
        "em_stderr": 0.004958702554959804,
        "f1": 0.45221476510067204,
        "f1_stderr": 0.004729347386559949
    },
    "harness|gsm8k|5": {
        "acc": 0.01819560272934041,
        "acc_stderr": 0.003681611894073872
    },
    "harness|winogrande|5": {
        "acc": 0.7687450670876085,
        "acc_stderr": 0.011850040124850508
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
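The card lists 64 per-task configurations plus an aggregated "results" configuration; when browsing such a repository it can help to enumerate them programmatically rather than copying names by hand. A small sketch using the standard `datasets` helpers (the exact list returned is whatever the Hub reports for this repository):

```python
from datasets import get_dataset_config_names, load_dataset

repo_id = "open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B"

# Enumerate every configuration: one "harness_..." config per evaluated task,
# plus the aggregated "results" config.
configs = get_dataset_config_names(repo_id)
print(len(configs), configs[:5])

# Any of them can then be loaded by name.
drop_details = load_dataset(repo_id, "harness_drop_3", split="latest")
```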
open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B
[ "region:us" ]
2023-08-17T23:08:18+00:00
{"pretty_name": "Evaluation run of garage-bAInd/Stable-Platypus2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [garage-bAInd/Stable-Platypus2-13B](https://huggingface.co/garage-bAInd/Stable-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T23:47:31.962394](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B/blob/main/results_2023-09-17T23-47-31.962394.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.37531459731543626,\n \"em_stderr\": 0.004958702554959804,\n \"f1\": 0.45221476510067204,\n \"f1_stderr\": 0.004729347386559949,\n \"acc\": 0.39347033490847444,\n \"acc_stderr\": 0.00776582600946219\n },\n \"harness|drop|3\": {\n \"em\": 0.37531459731543626,\n \"em_stderr\": 0.004958702554959804,\n \"f1\": 0.45221476510067204,\n \"f1_stderr\": 0.004729347386559949\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \"acc_stderr\": 0.003681611894073872\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n }\n}\n```", "repo_url": "https://huggingface.co/garage-bAInd/Stable-Platypus2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T23_47_31.962394", "path": ["**/details_harness|drop|3_2023-09-17T23-47-31.962394.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T23-47-31.962394.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T23_47_31.962394", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-47-31.962394.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-47-31.962394.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:52:34.927040.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T15:52:34.927040.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T23_47_31.962394", "path": ["**/details_harness|winogrande|5_2023-09-17T23-47-31.962394.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T23-47-31.962394.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T15_52_34.927040", "path": ["results_2023-08-09T15:52:34.927040.parquet"]}, {"split": "2023_09_17T23_47_31.962394", "path": ["results_2023-09-17T23-47-31.962394.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T23-47-31.962394.parquet"]}]}]}
2023-09-17T22:47:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of garage-bAInd/Stable-Platypus2-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model garage-bAInd/Stable-Platypus2-13B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T23:47:31.962394 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
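This stripped copy of the card also drops the loading example it refers to; a brief sketch follows, using the repository id and the "results" configuration listed in the metadata above (the aggregated scores for each run). Split names are taken verbatim from that metadata.

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_garage-bAInd__Stable-Platypus2-13B"

# Aggregated metrics: "latest" points at the most recent run, and each run
# also has its own timestamped split (see the "results" config in the metadata).
latest_results = load_dataset(repo_id, "results", split="latest")
run_results = load_dataset(repo_id, "results", split="2023_09_17T23_47_31.962394")
```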
[ "# Dataset Card for Evaluation run of garage-bAInd/Stable-Platypus2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Stable-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T23:47:31.962394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of garage-bAInd/Stable-Platypus2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Stable-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T23:47:31.962394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of garage-bAInd/Stable-Platypus2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Stable-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T23:47:31.962394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d603c0b170055af013736319de2d0ee91a2aaa8f
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/garage-bAInd/Platypus2-70B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-70B](https://huggingface.co/garage-bAInd/Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Platypus2-70B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-13T01:27:19.477950](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B/blob/main/results_2023-10-13T01-27-19.477950.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.4649748322147651,
        "em_stderr": 0.005107889346229416,
        "f1": 0.5141369546979866,
        "f1_stderr": 0.004846183113432682,
        "acc": 0.58713939251053,
        "acc_stderr": 0.011581424079479265
    },
    "harness|drop|3": {
        "em": 0.4649748322147651,
        "em_stderr": 0.005107889346229416,
        "f1": 0.5141369546979866,
        "f1_stderr": 0.004846183113432682
    },
    "harness|gsm8k|5": {
        "acc": 0.3305534495830174,
        "acc_stderr": 0.012957496367085028
    },
    "harness|winogrande|5": {
        "acc": 0.8437253354380426,
        "acc_stderr": 0.010205351791873502
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
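As a complement to the snippet in the card above, here is a minimal sketch of how the aggregated metrics quoted under "Latest results" could be pulled from this repository. It assumes only the "results" configuration and the "latest" split declared in the metadata below; the column layout of the results parquet is not documented here, so treat the prints as inspection only.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run live in the "results" configuration;
# its "latest" split points at the newest results parquet file (see the metadata below).
results = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Platypus2-70B",
    "results",
    split="latest",
)

# The exact schema is an assumption, so inspect the columns before relying on them.
print(results.column_names)
print(results[0])
```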
open-llm-leaderboard/details_garage-bAInd__Platypus2-70B
[ "region:us" ]
2023-08-17T23:08:27+00:00
{"pretty_name": "Evaluation run of garage-bAInd/Platypus2-70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-70B](https://huggingface.co/garage-bAInd/Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Platypus2-70B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T01:27:19.477950](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B/blob/main/results_2023-10-13T01-27-19.477950.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4649748322147651,\n \"em_stderr\": 0.005107889346229416,\n \"f1\": 0.5141369546979866,\n \"f1_stderr\": 0.004846183113432682,\n \"acc\": 0.58713939251053,\n \"acc_stderr\": 0.011581424079479265\n },\n \"harness|drop|3\": {\n \"em\": 0.4649748322147651,\n \"em_stderr\": 0.005107889346229416,\n \"f1\": 0.5141369546979866,\n \"f1_stderr\": 0.004846183113432682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3305534495830174,\n \"acc_stderr\": 0.012957496367085028\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873502\n }\n}\n```", "repo_url": "https://huggingface.co/garage-bAInd/Platypus2-70B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|arc:challenge|25_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T01_27_19.477950", "path": ["**/details_harness|drop|3_2023-10-13T01-27-19.477950.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T01-27-19.477950.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T01_27_19.477950", "path": ["**/details_harness|gsm8k|5_2023-10-13T01-27-19.477950.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T01-27-19.477950.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hellaswag|10_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T02:16:23.299080.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T02:16:23.299080.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T01_27_19.477950", "path": ["**/details_harness|winogrande|5_2023-10-13T01-27-19.477950.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T01-27-19.477950.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T02_16_23.299080", "path": ["results_2023-08-10T02:16:23.299080.parquet"]}, {"split": "2023_10_13T01_27_19.477950", "path": ["results_2023-10-13T01-27-19.477950.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T01-27-19.477950.parquet"]}]}]}
2023-10-13T00:27:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-13T01:27:19.477950 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
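The flattened copy above omits the loading snippet that its sentence "To load the details from a run, you can for instance do the following:" refers to. As a small, hedged sketch reusing the repository id given in the full card earlier, the configurations mentioned there can also be discovered programmatically rather than read out of the YAML metadata:

```python
from datasets import get_dataset_config_names

# Enumerate every configuration declared for this evaluation run
# (the per-task detail configs plus the aggregated "results" config).
configs = get_dataset_config_names("open-llm-leaderboard/details_garage-bAInd__Platypus2-70B")
print(len(configs))
for name in sorted(configs):
    print(name)
```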
[ "# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T01:27:19.477950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-13T01:27:19.477950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model garage-bAInd/Platypus2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T01:27:19.477950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e79a4fed15d4e47f3cfa40b952fdc2178dc28abf
# Dataset Card for Evaluation run of microsoft/DialoGPT-large

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/microsoft/DialoGPT-large
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [microsoft/DialoGPT-large](https://huggingface.co/microsoft/DialoGPT-large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_microsoft__DialoGPT-large",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-26T03:53:29.500028](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-large/blob/main/results_2023-10-26T03-53-29.500028.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.005557885906040268,
        "em_stderr": 0.0007613497667018535,
        "f1": 0.005801174496644296,
        "f1_stderr": 0.0007683799920084722,
        "acc": 0.26203630623520124,
        "acc_stderr": 0.007018094832697566
    },
    "harness|drop|3": {
        "em": 0.005557885906040268,
        "em_stderr": 0.0007613497667018535,
        "f1": 0.005801174496644296,
        "f1_stderr": 0.0007683799920084722
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5240726124704025,
        "acc_stderr": 0.014036189665395132
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
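Beyond the winogrande example in the card above, each task has its own configuration. As a hedged sketch, the per-example DROP details from the 2023-10-26 run could be loaded as below; the configuration name "harness_drop_3" and the "latest" split come from the metadata that follows, while the row schema is not documented here, so the prints are only for inspection.

```python
from datasets import load_dataset

# Per-example DROP details for the latest DialoGPT-large evaluation
# ("harness_drop_3" configuration, "latest" split, per the metadata below).
drop_details = load_dataset(
    "open-llm-leaderboard/details_microsoft__DialoGPT-large",
    "harness_drop_3",
    split="latest",
)

print(drop_details.num_rows)
print(drop_details.column_names)
```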
open-llm-leaderboard/details_microsoft__DialoGPT-large
[ "region:us" ]
2023-08-17T23:08:36+00:00
{"pretty_name": "Evaluation run of microsoft/DialoGPT-large", "dataset_summary": "Dataset automatically created during the evaluation run of model [microsoft/DialoGPT-large](https://huggingface.co/microsoft/DialoGPT-large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__DialoGPT-large\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-26T03:53:29.500028](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-large/blob/main/results_2023-10-26T03-53-29.500028.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005557885906040268,\n \"em_stderr\": 0.0007613497667018535,\n \"f1\": 0.005801174496644296,\n \"f1_stderr\": 0.0007683799920084722,\n \"acc\": 0.26203630623520124,\n \"acc_stderr\": 0.007018094832697566\n },\n \"harness|drop|3\": {\n \"em\": 0.005557885906040268,\n \"em_stderr\": 0.0007613497667018535,\n \"f1\": 0.005801174496644296,\n \"f1_stderr\": 0.0007683799920084722\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5240726124704025,\n \"acc_stderr\": 0.014036189665395132\n }\n}\n```", "repo_url": "https://huggingface.co/microsoft/DialoGPT-large", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|arc:challenge|25_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_26T03_53_29.500028", "path": ["**/details_harness|drop|3_2023-10-26T03-53-29.500028.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-26T03-53-29.500028.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_26T03_53_29.500028", "path": ["**/details_harness|gsm8k|5_2023-10-26T03-53-29.500028.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-26T03-53-29.500028.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hellaswag|10_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T17:41:47.866293.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T17:41:47.866293.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T17:41:47.866293.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T17:41:47.866293.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T17:41:47.866293.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T17:41:47.866293.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_26T03_53_29.500028", "path": ["**/details_harness|winogrande|5_2023-10-26T03-53-29.500028.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-26T03-53-29.500028.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T17_41_47.866293", "path": ["results_2023-07-18T17:41:47.866293.parquet"]}, {"split": "2023_10_26T03_53_29.500028", "path": ["results_2023-10-26T03-53-29.500028.parquet"]}, {"split": "latest", "path": ["results_2023-10-26T03-53-29.500028.parquet"]}]}]}
2023-10-26T02:53:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of microsoft/DialoGPT-large ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model microsoft/DialoGPT-large on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-26T03:53:29.500028 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of microsoft/DialoGPT-large", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-large on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T03:53:29.500028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of microsoft/DialoGPT-large", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-large on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-26T03:53:29.500028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 167, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of microsoft/DialoGPT-large## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-large on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-26T03:53:29.500028(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c33494b30eb9c70afa1fd0441b0ac3a9b92361f6
# Dataset Card for Evaluation run of microsoft/DialoGPT-small

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/microsoft/DialoGPT-small
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [microsoft/DialoGPT-small](https://huggingface.co/microsoft/DialoGPT-small) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_microsoft__DialoGPT-small",
	"harness_gsm8k_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-03T18:22:26.346357](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-small/blob/main/results_2023-12-03T18-22-26.346357.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
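Beyond the "train" alias above, the timestamped splits and the "latest" split of any listed configuration can be loaded directly. A minimal sketch, assuming the `harness_gsm8k_5` configuration and its `latest` split taken from this repository's config list:

```python
from datasets import load_dataset

# Sketch: load the most recent per-sample GSM8K details for this model.
# "harness_gsm8k_5" and its "latest" split come from the configuration list
# in this repository's metadata; "latest" mirrors the newest timestamped run.
details = load_dataset(
    "open-llm-leaderboard/details_microsoft__DialoGPT-small",
    "harness_gsm8k_5",
    split="latest",
)

# Inspect which per-example fields were stored, then look at the first record.
print(details.column_names)
print(details[0])
```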
open-llm-leaderboard/details_microsoft__DialoGPT-small
[ "region:us" ]
2023-08-17T23:08:46+00:00
{"pretty_name": "Evaluation run of microsoft/DialoGPT-small", "dataset_summary": "Dataset automatically created during the evaluation run of model [microsoft/DialoGPT-small](https://huggingface.co/microsoft/DialoGPT-small) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__DialoGPT-small\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T18:22:26.346357](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-small/blob/main/results_2023-12-03T18-22-26.346357.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/microsoft/DialoGPT-small", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T19_54_07.074277", "path": ["**/details_harness|drop|3_2023-10-17T19-54-07.074277.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T19-54-07.074277.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T19_54_07.074277", "path": ["**/details_harness|gsm8k|5_2023-10-17T19-54-07.074277.parquet"]}, {"split": "2023_12_03T18_22_26.346357", "path": ["**/details_harness|gsm8k|5_2023-12-03T18-22-26.346357.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T18-22-26.346357.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:58:31.382707.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:58:31.382707.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:58:31.382707.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:58:31.382707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:58:31.382707.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:58:31.382707.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:58:31.382707.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T19_54_07.074277", "path": ["**/details_harness|winogrande|5_2023-10-17T19-54-07.074277.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T19-54-07.074277.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_58_31.382707", "path": ["results_2023-07-19T18:58:31.382707.parquet"]}, {"split": "2023_10_17T19_54_07.074277", "path": ["results_2023-10-17T19-54-07.074277.parquet"]}, {"split": "2023_12_03T18_22_26.346357", "path": ["results_2023-12-03T18-22-26.346357.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T18-22-26.346357.parquet"]}]}]}
2023-12-03T18:22:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of microsoft/DialoGPT-small ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model microsoft/DialoGPT-small on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-03T18:22:26.346357 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
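The flattened card text above refers to a loading snippet that was stripped from this field. As a minimal sketch, assuming the repository id follows the leaderboard's usual naming pattern (`open-llm-leaderboard/details_microsoft__DialoGPT-small`, not shown verbatim in this excerpt) and using the `harness_winogrande_5` configuration declared in the metadata above:

```python
from datasets import load_dataset

# Repository id assumed from the Open LLM Leaderboard naming convention (not confirmed in this excerpt).
repo = "open-llm-leaderboard/details_microsoft__DialoGPT-small"

# "harness_winogrande_5" is one of the configurations listed in this card's metadata;
# the "latest" split is an alias for the most recent evaluation run.
details = load_dataset(repo, "harness_winogrande_5", split="latest")
print(details)
```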
[ "# Dataset Card for Evaluation run of microsoft/DialoGPT-small", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-small on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T18:22:26.346357(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of microsoft/DialoGPT-small", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-small on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T18:22:26.346357(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of microsoft/DialoGPT-small## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-small on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T18:22:26.346357(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
017fd607d393a5e25ec335f6ed360b50484e9a1a
# Dataset Card for Evaluation run of microsoft/DialoGPT-medium

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/microsoft/DialoGPT-medium
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [microsoft/DialoGPT-medium](https://huggingface.co/microsoft/DialoGPT-medium) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_microsoft__DialoGPT-medium",
	"harness_gsm8k_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-03T18:15:54.629306](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-medium/blob/main/results_2023-12-03T18-15-54.629306.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_microsoft__DialoGPT-medium
[ "region:us" ]
2023-08-17T23:08:55+00:00
{"pretty_name": "Evaluation run of microsoft/DialoGPT-medium", "dataset_summary": "Dataset automatically created during the evaluation run of model [microsoft/DialoGPT-medium](https://huggingface.co/microsoft/DialoGPT-medium) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__DialoGPT-medium\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T18:15:54.629306](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-medium/blob/main/results_2023-12-03T18-15-54.629306.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/microsoft/DialoGPT-medium", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T05_52_00.103585", "path": ["**/details_harness|drop|3_2023-10-18T05-52-00.103585.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T05-52-00.103585.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T05_52_00.103585", "path": ["**/details_harness|gsm8k|5_2023-10-18T05-52-00.103585.parquet"]}, {"split": "2023_12_03T18_15_54.629306", "path": ["**/details_harness|gsm8k|5_2023-12-03T18-15-54.629306.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T18-15-54.629306.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:27.633576.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:27.633576.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:27.633576.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:27.633576.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:21:27.633576.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:21:27.633576.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:21:27.633576.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T05_52_00.103585", "path": ["**/details_harness|winogrande|5_2023-10-18T05-52-00.103585.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T05-52-00.103585.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_21_27.633576", "path": ["results_2023-07-19T19:21:27.633576.parquet"]}, {"split": "2023_10_18T05_52_00.103585", "path": ["results_2023-10-18T05-52-00.103585.parquet"]}, {"split": "2023_12_03T18_15_54.629306", "path": ["results_2023-12-03T18-15-54.629306.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T18-15-54.629306.parquet"]}]}]}
2023-12-03T18:16:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of microsoft/DialoGPT-medium ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model microsoft/DialoGPT-medium on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-03T18:15:54.629306 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
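The flattened summary above ends at "you can for instance do the following:" because the code block was stripped when the card was converted to plain text. The sketch below reconstructs the kind of call that sentence refers to. The repository id is not stated in this record; it is inferred from the open-llm-leaderboard/details_<org>__<model> naming convention used by the other cards in this file, so treat it as an assumption. The config name and the "latest" split are taken from the config metadata listed a few lines earlier.

```python
from datasets import load_dataset

# Repository id inferred from the naming convention of the surrounding cards
# (assumption; it is not stated explicitly in this record).
repo_id = "open-llm-leaderboard/details_microsoft__DialoGPT-medium"

# "harness_winogrande_5" and the "latest" split both appear in this record's
# config metadata above.
details = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(details)
```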
[ "# Dataset Card for Evaluation run of microsoft/DialoGPT-medium", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T18:15:54.629306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of microsoft/DialoGPT-medium", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-03T18:15:54.629306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of microsoft/DialoGPT-medium## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/DialoGPT-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T18:15:54.629306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0e77f7b0b091e110a649ec364b3a54bae11a0568
# Dataset Card for Evaluation run of huggingtweets/jerma985 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/huggingtweets/jerma985 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [huggingtweets/jerma985](https://huggingface.co/huggingtweets/jerma985) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_huggingtweets__jerma985", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-22T15:13:39.388412](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__jerma985/blob/main/results_2023-09-22T15-13-39.388412.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.014786073825503355, "em_stderr": 0.0012360366760473087, "f1": 0.0371633808724832, "f1_stderr": 0.001611424008567761, "acc": 0.2533543804262036, "acc_stderr": 0.0070256103461651745 }, "harness|drop|3": { "em": 0.014786073825503355, "em_stderr": 0.0012360366760473087, "f1": 0.0371633808724832, "f1_stderr": 0.001611424008567761 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.5067087608524072, "acc_stderr": 0.014051220692330349 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_huggingtweets__jerma985
[ "region:us" ]
2023-08-17T23:09:04+00:00
{"pretty_name": "Evaluation run of huggingtweets/jerma985", "dataset_summary": "Dataset automatically created during the evaluation run of model [huggingtweets/jerma985](https://huggingface.co/huggingtweets/jerma985) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingtweets__jerma985\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T15:13:39.388412](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__jerma985/blob/main/results_2023-09-22T15-13-39.388412.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760473087,\n \"f1\": 0.0371633808724832,\n \"f1_stderr\": 0.001611424008567761,\n \"acc\": 0.2533543804262036,\n \"acc_stderr\": 0.0070256103461651745\n },\n \"harness|drop|3\": {\n \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760473087,\n \"f1\": 0.0371633808724832,\n \"f1_stderr\": 0.001611424008567761\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5067087608524072,\n \"acc_stderr\": 0.014051220692330349\n }\n}\n```", "repo_url": "https://huggingface.co/huggingtweets/jerma985", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T15_13_39.388412", "path": ["**/details_harness|drop|3_2023-09-22T15-13-39.388412.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T15-13-39.388412.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T15_13_39.388412", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-13-39.388412.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-13-39.388412.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:38:23.212427.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:38:23.212427.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:38:23.212427.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T15_13_39.388412", "path": ["**/details_harness|winogrande|5_2023-09-22T15-13-39.388412.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T15-13-39.388412.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T10_38_23.212427", "path": ["results_2023-07-19T10:38:23.212427.parquet"]}, {"split": "2023_09_22T15_13_39.388412", "path": ["results_2023-09-22T15-13-39.388412.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T15-13-39.388412.parquet"]}]}]}
2023-09-22T14:13:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of huggingtweets/jerma985 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model huggingtweets/jerma985 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-22T15:13:39.388412 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
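As with the previous flattened card, the text stops at "do the following:" without the snippet that the sentence announces. The code below simply mirrors the example embedded in this record's full markdown card and metadata (same repository id, config name, and split), so it reassembles what is already in the record rather than adding anything new.

```python
from datasets import load_dataset

# Repository id, config name, and split mirror the snippet stored in this
# record's dataset_summary metadata.
data = load_dataset(
    "open-llm-leaderboard/details_huggingtweets__jerma985",
    "harness_winogrande_5",
    split="train",
)
```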
[ "# Dataset Card for Evaluation run of huggingtweets/jerma985", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingtweets/jerma985 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:13:39.388412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of huggingtweets/jerma985", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingtweets/jerma985 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-22T15:13:39.388412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of huggingtweets/jerma985## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingtweets/jerma985 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T15:13:39.388412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
52692336547abf1cc51b30464f744e3b9196b52a
# Dataset Card for Evaluation run of huggingtweets/bladeecity-jerma985 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/huggingtweets/bladeecity-jerma985 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [huggingtweets/bladeecity-jerma985](https://huggingface.co/huggingtweets/bladeecity-jerma985) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_huggingtweets__bladeecity-jerma985", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-16T19:05:34.186179](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__bladeecity-jerma985/blob/main/results_2023-09-16T19-05-34.186179.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.03125, "em_stderr": 0.0017818474501498898, "f1": 0.05531879194630876, "f1_stderr": 0.0020837697984111155, "acc": 0.2600631412786109, "acc_stderr": 0.007020548332172167 }, "harness|drop|3": { "em": 0.03125, "em_stderr": 0.0017818474501498898, "f1": 0.05531879194630876, "f1_stderr": 0.0020837697984111155 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.5201262825572218, "acc_stderr": 0.014041096664344334 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_huggingtweets__bladeecity-jerma985
[ "region:us" ]
2023-08-17T23:09:13+00:00
{"pretty_name": "Evaluation run of huggingtweets/bladeecity-jerma985", "dataset_summary": "Dataset automatically created during the evaluation run of model [huggingtweets/bladeecity-jerma985](https://huggingface.co/huggingtweets/bladeecity-jerma985) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingtweets__bladeecity-jerma985\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T19:05:34.186179](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingtweets__bladeecity-jerma985/blob/main/results_2023-09-16T19-05-34.186179.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03125,\n \"em_stderr\": 0.0017818474501498898,\n \"f1\": 0.05531879194630876,\n \"f1_stderr\": 0.0020837697984111155,\n \"acc\": 0.2600631412786109,\n \"acc_stderr\": 0.007020548332172167\n },\n \"harness|drop|3\": {\n \"em\": 0.03125,\n \"em_stderr\": 0.0017818474501498898,\n \"f1\": 0.05531879194630876,\n \"f1_stderr\": 0.0020837697984111155\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5201262825572218,\n \"acc_stderr\": 0.014041096664344334\n }\n}\n```", "repo_url": "https://huggingface.co/huggingtweets/bladeecity-jerma985", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T19_05_34.186179", "path": ["**/details_harness|drop|3_2023-09-16T19-05-34.186179.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T19-05-34.186179.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T19_05_34.186179", "path": ["**/details_harness|gsm8k|5_2023-09-16T19-05-34.186179.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T19-05-34.186179.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:07:22.226124.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:07:22.226124.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:07:22.226124.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:07:22.226124.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:07:22.226124.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T19_05_34.186179", "path": ["**/details_harness|winogrande|5_2023-09-16T19-05-34.186179.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T19-05-34.186179.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_07_22.226124", "path": ["results_2023-07-19T19:07:22.226124.parquet"]}, {"split": "2023_09_16T19_05_34.186179", "path": ["results_2023-09-16T19-05-34.186179.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T19-05-34.186179.parquet"]}]}]}
2023-09-16T18:05:45+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of huggingtweets/bladeecity-jerma985 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model huggingtweets/bladeecity-jerma985 on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-16T19:05:34.186179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of huggingtweets/bladeecity-jerma985", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingtweets/bladeecity-jerma985 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T19:05:34.186179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of huggingtweets/bladeecity-jerma985", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingtweets/bladeecity-jerma985 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-16T19:05:34.186179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of huggingtweets/bladeecity-jerma985## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingtweets/bladeecity-jerma985 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T19:05:34.186179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
91b8099a69ecb01c5caca61b43f2bc7566e3d4b0
# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-v1

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [IDEA-CCNL/Ziya-LLaMA-13B-v1](https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-18T04:43:18.868497](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1/blob/main/results_2023-09-18T04-43-18.868497.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 3.145973154362416e-06,
        "f1_stderr": 3.145973154362522e-06,
        "acc": 0.2478295185477506,
        "acc_stderr": 0.007025978032038446
    },
    "harness|drop|3": {
        "em": 0.0,
        "em_stderr": 0.0,
        "f1": 3.145973154362416e-06,
        "f1_stderr": 3.145973154362522e-06
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.4956590370955012,
        "acc_stderr": 0.014051956064076892
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
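Beyond the single-configuration example in the card above, the sketch below first enumerates the available configurations for this repository and then loads the per-example details of one task. The configuration name `harness_drop_3` and the `latest` split both appear in this record's metadata; whether every listed configuration is populated for this particular model is an assumption, so treat the snippet as illustrative rather than as the card's own instructions.

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1"

# The card states there are 64 configurations, one per evaluated task;
# list them instead of hard-coding the full set here.
configs = get_dataset_config_names(REPO)
print(len(configs), "configurations found")

# Per-example details for the DROP task; "latest" resolves to the most
# recent run (2023-09-18 for this model) according to the metadata.
drop_details = load_dataset(REPO, "harness_drop_3", split="latest")
print(drop_details)
```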
open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1
[ "region:us" ]
2023-08-17T23:09:23+00:00
{"pretty_name": "Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [IDEA-CCNL/Ziya-LLaMA-13B-v1](https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T04:43:18.868497](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1/blob/main/results_2023-09-18T04-43-18.868497.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 3.145973154362416e-06,\n \"f1_stderr\": 3.145973154362522e-06,\n \"acc\": 0.2478295185477506,\n \"acc_stderr\": 0.007025978032038446\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 3.145973154362416e-06,\n \"f1_stderr\": 3.145973154362522e-06\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076892\n }\n}\n```", "repo_url": "https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T04_43_18.868497", "path": ["**/details_harness|drop|3_2023-09-18T04-43-18.868497.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T04-43-18.868497.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T04_43_18.868497", "path": ["**/details_harness|gsm8k|5_2023-09-18T04-43-18.868497.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T04-43-18.868497.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:24:58.972667.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:24:58.972667.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:24:58.972667.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:24:58.972667.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:24:58.972667.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:24:58.972667.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T04_43_18.868497", "path": ["**/details_harness|winogrande|5_2023-09-18T04-43-18.868497.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T04-43-18.868497.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_24_58.972667", "path": ["results_2023-07-19T18:24:58.972667.parquet"]}, {"split": "2023_09_18T04_43_18.868497", "path": ["results_2023-09-18T04-43-18.868497.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T04-43-18.868497.parquet"]}]}]}
2023-09-18T03:43:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-v1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card text): ## Latest results These are the latest results from run 2023-09-18T04:43:18.868497 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
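A minimal sketch of such a load call for this model's details, assuming the details repository follows the leaderboard's usual details_<org>__<model> naming pattern (open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1 is inferred, not confirmed by this card text) and using the harness_winogrande_5 config and splits listed in the metadata above:

```python
# Sketch only: the repo id below is inferred from the leaderboard's
# details_<org>__<model> naming pattern, not stated in this card text.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-v1",
    "harness_winogrande_5",
    split="latest",  # or the timestamped split "2023_09_18T04_43_18.868497"
)
```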
[ "# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T04:43:18.868497(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-18T04:43:18.868497(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 174, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T04:43:18.868497(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
11ea89dbe171a165df1416b2e10517e737e4fd1f
# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1](https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1", "harness_gsm8k_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-02T16:26:56.383238](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1/blob/main/results_2023-12-02T16-26-56.383238.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1
[ "region:us" ]
2023-08-17T23:09:32+00:00
{"pretty_name": "Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1](https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-02T16:26:56.383238](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1/blob/main/results_2023-12-02T16-26-56.383238.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T07_31_46.021134", "path": ["**/details_harness|drop|3_2023-10-13T07-31-46.021134.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T07-31-46.021134.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T07_31_46.021134", "path": ["**/details_harness|gsm8k|5_2023-10-13T07-31-46.021134.parquet"]}, {"split": "2023_12_02T16_26_56.383238", "path": ["**/details_harness|gsm8k|5_2023-12-02T16-26-56.383238.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-02T16-26-56.383238.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:27:13.663491.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:27:13.663491.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:27:13.663491.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T07_31_46.021134", "path": ["**/details_harness|winogrande|5_2023-10-13T07-31-46.021134.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T07-31-46.021134.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T14_27_13.663491", "path": ["results_2023-07-18T14:27:13.663491.parquet"]}, {"split": "2023_10_13T07_31_46.021134", "path": ["results_2023-10-13T07-31-46.021134.parquet"]}, {"split": "2023_12_02T16_26_56.383238", "path": ["results_2023-12-02T16-26-56.383238.parquet"]}, {"split": "latest", "path": ["results_2023-12-02T16-26-56.383238.parquet"]}]}]}
2023-12-02T16:27:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1 on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the example sketched right after this card):

## Latest results

These are the latest results from run 2023-12-02T16:26:56.383238 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
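The loading snippet referenced in the card above was dropped from this flattened copy of its text. A minimal sketch follows, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other records in this dump and that a `harness_winogrande_5` configuration exists for this model; both are assumptions, not confirmed by this record.

```python
from datasets import load_dataset

# Assumed repository id: built from the details_<org>__<model> naming pattern
# seen in the other Open LLM Leaderboard detail datasets in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_IDEA-CCNL__Ziya-LLaMA-13B-Pretrain-v1",
    "harness_winogrande_5",  # assumed config name, mirroring the other cards
    split="train",           # "train" always points to the latest results
)
```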
[ "# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-02T16:26:56.383238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-02T16:26:56.383238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-02T16:26:56.383238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8450c600e708bd16c033befeaca50d7e79d3f3d6
# Dataset Card for Evaluation run of Fredithefish/ScarletPajama-3B-HF

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/ScarletPajama-3B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Fredithefish/ScarletPajama-3B-HF](https://huggingface.co/Fredithefish/ScarletPajama-3B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T04:53:29.822366](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF/blob/main/results_2023-10-17T04-53-29.822366.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.004404362416107382,
        "em_stderr": 0.000678145162047974,
        "f1": 0.05973783557047,
        "f1_stderr": 0.00145869394982755,
        "acc": 0.3235523790774504,
        "acc_stderr": 0.00738110264721833
    },
    "harness|drop|3": {
        "em": 0.004404362416107382,
        "em_stderr": 0.000678145162047974,
        "f1": 0.05973783557047,
        "f1_stderr": 0.00145869394982755
    },
    "harness|gsm8k|5": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.0013121578148674066
    },
    "harness|winogrande|5": {
        "acc": 0.6448303078137332,
        "acc_stderr": 0.013450047479569254
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
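The card above loads one task's per-sample details; the aggregated numbers it describes live in the "results" configuration. Below is a minimal sketch, assuming the "results" config and its "latest" split are exposed exactly as listed in this record's metadata (the names are taken from that config listing, not from the card itself):

```python
from datasets import load_dataset

# Config name "results" and split "latest" are read off the metadata below;
# treat them as assumptions about how the parquet files are exposed.
results = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent run
```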
open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF
[ "region:us" ]
2023-08-17T23:09:41+00:00
{"pretty_name": "Evaluation run of Fredithefish/ScarletPajama-3B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [Fredithefish/ScarletPajama-3B-HF](https://huggingface.co/Fredithefish/ScarletPajama-3B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T04:53:29.822366](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF/blob/main/results_2023-10-17T04-53-29.822366.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.000678145162047974,\n \"f1\": 0.05973783557047,\n \"f1_stderr\": 0.00145869394982755,\n \"acc\": 0.3235523790774504,\n \"acc_stderr\": 0.00738110264721833\n },\n \"harness|drop|3\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.000678145162047974,\n \"f1\": 0.05973783557047,\n \"f1_stderr\": 0.00145869394982755\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674066\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6448303078137332,\n \"acc_stderr\": 0.013450047479569254\n }\n}\n```", "repo_url": "https://huggingface.co/Fredithefish/ScarletPajama-3B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|arc:challenge|25_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|arc:challenge|25_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T04_53_29.822366", "path": ["**/details_harness|drop|3_2023-10-17T04-53-29.822366.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T04-53-29.822366.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T04_53_29.822366", "path": ["**/details_harness|gsm8k|5_2023-10-17T04-53-29.822366.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T04-53-29.822366.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": 
[{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hellaswag|10_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hellaswag|10_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:40:07.998848.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T10:40:07.998848.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:59:29.744691.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:59:29.744691.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T10:59:29.744691.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": 
"2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": 
["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": 
[{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:22:36.276384.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:22:36.276384.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T04_53_29.822366", "path": ["**/details_harness|winogrande|5_2023-10-17T04-53-29.822366.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T04-53-29.822366.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T10_40_07.998848", "path": ["results_2023-07-18T10:40:07.998848.parquet"]}, {"split": "2023_07_18T10_59_29.744691", "path": ["results_2023-07-18T10:59:29.744691.parquet"]}, {"split": "2023_07_18T11_22_36.276384", "path": ["results_2023-07-18T11:22:36.276384.parquet"]}, {"split": "2023_10_17T04_53_29.822366", "path": ["results_2023-10-17T04-53-29.822366.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T04-53-29.822366.parquet"]}]}]}
2023-10-17T03:53:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Fredithefish/ScarletPajama-3B-HF ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Fredithefish/ScarletPajama-3B-HF on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T04:53:29.822366 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
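The flattened card above stops right after "To load the details from a run, you can for instance do the following:", because the code snippet was dropped when the text was flattened. A minimal sketch of that call, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used elsewhere in this document (so the repo id below is inferred, not quoted from the card), and using the `harness_winogrande_5` config declared in the metadata above:

```python
from datasets import load_dataset

# Inferred repo id (details_<org>__<model> convention); adjust if the actual
# details repository for this model is named differently.
repo_id = "open-llm-leaderboard/details_Fredithefish__ScarletPajama-3B-HF"

# "latest" is the alias split declared in the metadata configs above; it points
# at the parquet file of the most recent run.
details = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(details)
```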
[ "# Dataset Card for Evaluation run of Fredithefish/ScarletPajama-3B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/ScarletPajama-3B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T04:53:29.822366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Fredithefish/ScarletPajama-3B-HF", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/ScarletPajama-3B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T04:53:29.822366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Fredithefish/ScarletPajama-3B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/ScarletPajama-3B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T04:53:29.822366(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cc1a7a423fdf40a50a7745efb8d870189d46cb05
# Dataset Card for Evaluation run of Fredithefish/CrimsonPajama

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Fredithefish/CrimsonPajama
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Fredithefish/CrimsonPajama](https://huggingface.co/Fredithefish/CrimsonPajama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__CrimsonPajama",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-17T20:55:57.055960](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__CrimsonPajama/blob/main/results_2023-10-17T20-55-57.055960.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.006396812080536913,
        "em_stderr": 0.0008164468837432291,
        "f1": 0.08161598154362382,
        "f1_stderr": 0.0017802453361789499,
        "acc": 0.3286203762267581,
        "acc_stderr": 0.007694655126017044
    },
    "harness|drop|3": {
        "em": 0.006396812080536913,
        "em_stderr": 0.0008164468837432291,
        "f1": 0.08161598154362382,
        "f1_stderr": 0.0017802453361789499
    },
    "harness|gsm8k|5": {
        "acc": 0.00530705079605762,
        "acc_stderr": 0.002001305720948034
    },
    "harness|winogrande|5": {
        "acc": 0.6519337016574586,
        "acc_stderr": 0.013388004531086054
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Fredithefish__CrimsonPajama
[ "region:us" ]
2023-08-17T23:09:54+00:00
{"pretty_name": "Evaluation run of Fredithefish/CrimsonPajama", "dataset_summary": "Dataset automatically created during the evaluation run of model [Fredithefish/CrimsonPajama](https://huggingface.co/Fredithefish/CrimsonPajama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__CrimsonPajama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T20:55:57.055960](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__CrimsonPajama/blob/main/results_2023-10-17T20-55-57.055960.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432291,\n \"f1\": 0.08161598154362382,\n \"f1_stderr\": 0.0017802453361789499,\n \"acc\": 0.3286203762267581,\n \"acc_stderr\": 0.007694655126017044\n },\n \"harness|drop|3\": {\n \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432291,\n \"f1\": 0.08161598154362382,\n \"f1_stderr\": 0.0017802453361789499\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.002001305720948034\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6519337016574586,\n \"acc_stderr\": 0.013388004531086054\n }\n}\n```", "repo_url": "https://huggingface.co/Fredithefish/CrimsonPajama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T20_55_57.055960", "path": ["**/details_harness|drop|3_2023-10-17T20-55-57.055960.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T20-55-57.055960.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T20_55_57.055960", "path": ["**/details_harness|gsm8k|5_2023-10-17T20-55-57.055960.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T20-55-57.055960.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:19:26.317110.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:19:26.317110.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T20_55_57.055960", "path": ["**/details_harness|winogrande|5_2023-10-17T20-55-57.055960.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T20-55-57.055960.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_19_26.317110", "path": ["results_2023-07-19T19:19:26.317110.parquet"]}, {"split": "2023_10_17T20_55_57.055960", "path": ["results_2023-10-17T20-55-57.055960.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T20-55-57.055960.parquet"]}]}]}
2023-10-17T19:56:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Fredithefish/CrimsonPajama ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Fredithefish/CrimsonPajama on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-17T20:55:57.055960 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
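The loading snippet referenced just above is not reproduced in this flattened copy of the card. Below is a minimal sketch of that call; the repository id is an assumption based on the `open-llm-leaderboard/details_{org}__{model}` naming pattern used by the other entries in this dump, while the `harness_winogrande_5` config and `latest` split are taken from this record's config listing.

```python
from datasets import load_dataset

# Assumed repository id, following the details_{org}__{model} naming pattern
# seen in the other leaderboard detail records of this dump.
repo_id = "open-llm-leaderboard/details_Fredithefish__CrimsonPajama"

# "harness_winogrande_5" and the "latest" split both appear in this record's
# config listing; "latest" mirrors the most recent evaluation run.
details = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(details)
```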
[ "# Dataset Card for Evaluation run of Fredithefish/CrimsonPajama", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/CrimsonPajama on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T20:55:57.055960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Fredithefish/CrimsonPajama", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/CrimsonPajama on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-17T20:55:57.055960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 18, 31, 166, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Fredithefish/CrimsonPajama## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/CrimsonPajama on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T20:55:57.055960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e58aa8ab3082464eab59b897cc4b19652e94583c
# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4](https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-28T15:50:00.560199](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4/blob/main/results_2023-09-28T15-50-00.560199.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.018246644295302015,
        "em_stderr": 0.0013706682452812888,
        "f1": 0.0714765100671141,
        "f1_stderr": 0.0018411955158404013,
        "acc": 0.32543219642729987,
        "acc_stderr": 0.007862138879264232
    },
    "harness|drop|3": {
        "em": 0.018246644295302015,
        "em_stderr": 0.0013706682452812888,
        "f1": 0.0714765100671141,
        "f1_stderr": 0.0018411955158404013
    },
    "harness|gsm8k|5": {
        "acc": 0.006823351023502654,
        "acc_stderr": 0.0022675371022545044
    },
    "harness|winogrande|5": {
        "acc": 0.6440410418310971,
        "acc_stderr": 0.013456740656273959
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4
[ "region:us" ]
2023-08-17T23:10:03+00:00
{"pretty_name": "Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4", "dataset_summary": "Dataset automatically created during the evaluation run of model [Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4](https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-28T15:50:00.560199](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4/blob/main/results_2023-09-28T15-50-00.560199.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018246644295302015,\n \"em_stderr\": 0.0013706682452812888,\n \"f1\": 0.0714765100671141,\n \"f1_stderr\": 0.0018411955158404013,\n \"acc\": 0.32543219642729987,\n \"acc_stderr\": 0.007862138879264232\n },\n \"harness|drop|3\": {\n \"em\": 0.018246644295302015,\n \"em_stderr\": 0.0013706682452812888,\n \"f1\": 0.0714765100671141,\n \"f1_stderr\": 0.0018411955158404013\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.0022675371022545044\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6440410418310971,\n \"acc_stderr\": 0.013456740656273959\n }\n}\n```", "repo_url": "https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_28T15_50_00.560199", "path": ["**/details_harness|drop|3_2023-09-28T15-50-00.560199.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-28T15-50-00.560199.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_28T15_50_00.560199", "path": ["**/details_harness|gsm8k|5_2023-09-28T15-50-00.560199.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-28T15-50-00.560199.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_07_19T14_47_41.742069", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:41.742069.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:41.742069.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:41.742069.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:41.742069.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:47:41.742069.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:47:41.742069.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_28T15_50_00.560199", "path": ["**/details_harness|winogrande|5_2023-09-28T15-50-00.560199.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-28T15-50-00.560199.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_47_41.742069", "path": ["results_2023-07-19T14:47:41.742069.parquet"]}, {"split": "2023_09_28T15_50_00.560199", "path": ["results_2023-09-28T15-50-00.560199.parquet"]}, {"split": "latest", "path": ["results_2023-09-28T15-50-00.560199.parquet"]}]}]}
2023-09-28T14:50:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-28T15:50:00.560199 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
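As with the other flattened card copies, the loading example is omitted here. A sketch of pulling the aggregated metrics through the `results` config listed in this record's metadata, rather than the per-example details:

```python
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4"

# The "results" config aggregates every run; its "latest" split points at the
# newest results file (results_2023-09-28T15-50-00.560199.parquet above).
results = load_dataset(repo_id, "results", split="latest")

# One aggregated record per run; print the first row for inspection.
print(results[0])
```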
[ "# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-28T15:50:00.560199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-28T15:50:00.560199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 37, 31, 185, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-Instruction-Tuning-with-GPT-4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-28T15:50:00.560199(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
fc286076d0f876aad41f9f26d2c2311b0ccdc385
# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K](https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-ShareGPT-11K",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-12T13:53:46.457193](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-ShareGPT-11K/blob/main/results_2023-10-12T13-53-46.457193.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.024643456375838927,
        "em_stderr": 0.0015877135903759347,
        "f1": 0.089086619127517,
        "f1_stderr": 0.0021123800816202727,
        "acc": 0.3191958582384948,
        "acc_stderr": 0.007521160091827196
    },
    "harness|drop|3": {
        "em": 0.024643456375838927,
        "em_stderr": 0.0015877135903759347,
        "f1": 0.089086619127517,
        "f1_stderr": 0.0021123800816202727
    },
    "harness|gsm8k|5": {
        "acc": 0.003032600454890068,
        "acc_stderr": 0.0015145735612245486
    },
    "harness|winogrande|5": {
        "acc": 0.6353591160220995,
        "acc_stderr": 0.013527746622429844
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
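Beyond the per-task detail configs, this repository's metadata also declares an aggregated "results" configuration with a "latest" split. The sketch below is illustrative only: it reuses the same `datasets` API as the loading example above, the config and split names are taken from the repository metadata, and the assumption that each row holds the aggregated metrics of one run is noted in the comments.

```python
from datasets import load_dataset

# Illustrative sketch: load the aggregated "results" configuration rather than
# the per-sample details. The "results" config and its "latest" split are the
# names declared in this repository's metadata; "latest" tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-ShareGPT-11K",
    "results",
    split="latest",
)

# Assumption: each row holds the aggregated metrics for one evaluation run.
print(results[0])
```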
open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-ShareGPT-11K
[ "region:us" ]
2023-08-17T23:10:12+00:00
{"pretty_name": "Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K", "dataset_summary": "Dataset automatically created during the evaluation run of model [Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K](https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-ShareGPT-11K\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T13:53:46.457193](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__RedPajama-INCITE-Chat-3B-ShareGPT-11K/blob/main/results_2023-10-12T13-53-46.457193.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.024643456375838927,\n \"em_stderr\": 0.0015877135903759347,\n \"f1\": 0.089086619127517,\n \"f1_stderr\": 0.0021123800816202727,\n \"acc\": 0.3191958582384948,\n \"acc_stderr\": 0.007521160091827196\n },\n \"harness|drop|3\": {\n \"em\": 0.024643456375838927,\n \"em_stderr\": 0.0015877135903759347,\n \"f1\": 0.089086619127517,\n \"f1_stderr\": 0.0021123800816202727\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245486\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6353591160220995,\n \"acc_stderr\": 0.013527746622429844\n }\n}\n```", "repo_url": "https://huggingface.co/Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T13_53_46.457193", "path": ["**/details_harness|drop|3_2023-10-12T13-53-46.457193.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T13-53-46.457193.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T13_53_46.457193", "path": ["**/details_harness|gsm8k|5_2023-10-12T13-53-46.457193.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T13-53-46.457193.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:56:30.747148.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:56:30.747148.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:56:30.747148.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:56:30.747148.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:56:30.747148.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:56:30.747148.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:56:30.747148.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T13_53_46.457193", "path": ["**/details_harness|winogrande|5_2023-10-12T13-53-46.457193.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T13-53-46.457193.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_56_30.747148", "path": ["results_2023-07-19T14:56:30.747148.parquet"]}, {"split": "2023_10_12T13_53_46.457193", "path": ["results_2023-10-12T13-53-46.457193.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T13-53-46.457193.parquet"]}]}]}
2023-10-12T12:54:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-12T13:53:46.457193 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T13:53:46.457193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T13:53:46.457193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/RedPajama-INCITE-Chat-3B-ShareGPT-11K on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T13:53:46.457193(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
93ca7497f78f39c8afd5dbf4b7bf80fd31d59381
# Dataset Card for Evaluation run of psyche/kogpt

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/psyche/kogpt
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [psyche/kogpt](https://huggingface.co/psyche/kogpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psyche__kogpt",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T16:10:56.600667](https://huggingface.co/datasets/open-llm-leaderboard/details_psyche__kogpt/blob/main/results_2023-10-14T16-10-56.600667.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.005138422818791947,
        "em_stderr": 0.000732210410279423,
        "f1": 0.028876887583892643,
        "f1_stderr": 0.0012126841041294677,
        "acc": 0.24546172059984214,
        "acc_stderr": 0.00702508504724885
    },
    "harness|drop|3": {
        "em": 0.005138422818791947,
        "em_stderr": 0.000732210410279423,
        "f1": 0.028876887583892643,
        "f1_stderr": 0.0012126841041294677
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.4909234411996843,
        "acc_stderr": 0.0140501700944977
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
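Because this repository was built from 3 runs, each per-task configuration keeps one timestamped split per run alongside "latest". The sketch below is illustrative only: the split names are copied from the `harness_gsm8k_5` configuration in the repository metadata, and it assumes those timestamped splits can be requested directly by name.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_psyche__kogpt"

# Illustrative sketch: load two timestamped runs of the same task configuration.
# The split names below are the ones declared for harness_gsm8k_5 in this
# repository's metadata; "latest" would resolve to the 2023-10-14 run.
earlier = load_dataset(REPO, "harness_gsm8k_5", split="2023_10_13T11_08_59.950038")
later = load_dataset(REPO, "harness_gsm8k_5", split="2023_10_14T16_10_56.600667")

# Compare how many examples each run evaluated.
print(len(earlier), len(later))
```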
open-llm-leaderboard/details_psyche__kogpt
[ "region:us" ]
2023-08-17T23:10:21+00:00
{"pretty_name": "Evaluation run of psyche/kogpt", "dataset_summary": "Dataset automatically created during the evaluation run of model [psyche/kogpt](https://huggingface.co/psyche/kogpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psyche__kogpt\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T16:10:56.600667](https://huggingface.co/datasets/open-llm-leaderboard/details_psyche__kogpt/blob/main/results_2023-10-14T16-10-56.600667.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005138422818791947,\n \"em_stderr\": 0.000732210410279423,\n \"f1\": 0.028876887583892643,\n \"f1_stderr\": 0.0012126841041294677,\n \"acc\": 0.24546172059984214,\n \"acc_stderr\": 0.00702508504724885\n },\n \"harness|drop|3\": {\n \"em\": 0.005138422818791947,\n \"em_stderr\": 0.000732210410279423,\n \"f1\": 0.028876887583892643,\n \"f1_stderr\": 0.0012126841041294677\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4909234411996843,\n \"acc_stderr\": 0.0140501700944977\n }\n}\n```", "repo_url": "https://huggingface.co/psyche/kogpt", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T11_08_59.950038", "path": ["**/details_harness|drop|3_2023-10-13T11-08-59.950038.parquet"]}, {"split": "2023_10_14T16_10_56.600667", "path": ["**/details_harness|drop|3_2023-10-14T16-10-56.600667.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T16-10-56.600667.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T11_08_59.950038", "path": ["**/details_harness|gsm8k|5_2023-10-13T11-08-59.950038.parquet"]}, {"split": "2023_10_14T16_10_56.600667", "path": ["**/details_harness|gsm8k|5_2023-10-14T16-10-56.600667.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T16-10-56.600667.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:23:49.331489.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:23:49.331489.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:23:49.331489.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:23:49.331489.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:23:49.331489.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:23:49.331489.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T11_08_59.950038", "path": ["**/details_harness|winogrande|5_2023-10-13T11-08-59.950038.parquet"]}, {"split": "2023_10_14T16_10_56.600667", "path": ["**/details_harness|winogrande|5_2023-10-14T16-10-56.600667.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T16-10-56.600667.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_23_49.331489", "path": ["results_2023-07-19T19:23:49.331489.parquet"]}, {"split": "2023_10_13T11_08_59.950038", "path": ["results_2023-10-13T11-08-59.950038.parquet"]}, {"split": "2023_10_14T16_10_56.600667", "path": ["results_2023-10-14T16-10-56.600667.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T16-10-56.600667.parquet"]}]}]}
2023-10-14T15:11:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of psyche/kogpt

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model psyche/kogpt on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-10-14T16:10:56.600667 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
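The loading snippet that the "Dataset Summary" above refers to is not shown in this stripped rendering; the following is a minimal sketch of the equivalent call, mirroring the `load_dataset` example embedded in this run's metadata (config `harness_winogrande_5`, split `train`):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task of psyche/kogpt; the config
# name and split mirror the example given in this run's metadata.
data = load_dataset("open-llm-leaderboard/details_psyche__kogpt",
                    "harness_winogrande_5",
                    split="train")
print(data)
```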
[ "# Dataset Card for Evaluation run of psyche/kogpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model psyche/kogpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T16:10:56.600667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of psyche/kogpt", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model psyche/kogpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T16:10:56.600667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 15, 31, 163, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psyche/kogpt## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psyche/kogpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T16:10:56.600667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8a70f719ff72d018a602b1a0f23a2166fda49bd9
# Dataset Card for Evaluation run of psyche/kollama2-7b

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/psyche/kollama2-7b
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [psyche/kollama2-7b](https://huggingface.co/psyche/kollama2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psyche__kollama2-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-14T17:15:12.197275](https://huggingface.co/datasets/open-llm-leaderboard/details_psyche__kollama2-7b/blob/main/results_2023-10-14T17-15-12.197275.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.006396812080536913,
        "em_stderr": 0.0008164468837432388,
        "f1": 0.07105180369127512,
        "f1_stderr": 0.0016055423368099692,
        "acc": 0.3997180423570749,
        "acc_stderr": 0.009435490911643514
    },
    "harness|drop|3": {
        "em": 0.006396812080536913,
        "em_stderr": 0.0008164468837432388,
        "f1": 0.07105180369127512,
        "f1_stderr": 0.0016055423368099692
    },
    "harness|gsm8k|5": {
        "acc": 0.05989385898407885,
        "acc_stderr": 0.006536148151288735
    },
    "harness|winogrande|5": {
        "acc": 0.739542225730071,
        "acc_stderr": 0.012334833671998292
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
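The aggregated metrics described above can also be read directly from the "results" configuration. The following is a minimal sketch, assuming the `datasets` library and the "results" configuration with the "latest" split listed in this card's configs:

```python
from datasets import load_dataset

# The "results" configuration aggregates every run; its "latest" split
# (an assumption read off the split names listed in the configs above)
# points at the most recent results file.
results = load_dataset("open-llm-leaderboard/details_psyche__kollama2-7b",
                       "results",
                       split="latest")
print(results[0])  # aggregated metrics of the newest run
```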
open-llm-leaderboard/details_psyche__kollama2-7b
[ "region:us" ]
2023-08-17T23:10:30+00:00
{"pretty_name": "Evaluation run of psyche/kollama2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psyche/kollama2-7b](https://huggingface.co/psyche/kollama2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psyche__kollama2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T17:15:12.197275](https://huggingface.co/datasets/open-llm-leaderboard/details_psyche__kollama2-7b/blob/main/results_2023-10-14T17-15-12.197275.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432388,\n \"f1\": 0.07105180369127512,\n \"f1_stderr\": 0.0016055423368099692,\n \"acc\": 0.3997180423570749,\n \"acc_stderr\": 0.009435490911643514\n },\n \"harness|drop|3\": {\n \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432388,\n \"f1\": 0.07105180369127512,\n \"f1_stderr\": 0.0016055423368099692\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05989385898407885,\n \"acc_stderr\": 0.006536148151288735\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998292\n }\n}\n```", "repo_url": "https://huggingface.co/psyche/kollama2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|arc:challenge|25_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T17_15_12.197275", "path": ["**/details_harness|drop|3_2023-10-14T17-15-12.197275.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T17-15-12.197275.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T17_15_12.197275", "path": ["**/details_harness|gsm8k|5_2023-10-14T17-15-12.197275.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T17-15-12.197275.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hellaswag|10_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T11:31:26.569073.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T11:31:26.569073.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T11:31:26.569073.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T11:31:26.569073.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T11:31:26.569073.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T11:31:26.569073.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T17_15_12.197275", "path": ["**/details_harness|winogrande|5_2023-10-14T17-15-12.197275.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T17-15-12.197275.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_26T11_31_26.569073", "path": ["results_2023-07-26T11:31:26.569073.parquet"]}, {"split": "2023_10_14T17_15_12.197275", "path": ["results_2023-10-14T17-15-12.197275.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T17-15-12.197275.parquet"]}]}]}
2023-10-14T16:15:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of psyche/kollama2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model psyche/kollama2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-14T17:15:12.197275 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
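The flattened copy above keeps the sentence "To load the details from a run, you can for instance do the following:" but drops the original code block. A minimal sketch of the intended call, assuming the repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming (the exact path for psyche/kollama2-7b is an assumption, not spelled out in this record) and using the `harness_winogrande_5` config with its `latest` split, both listed in the metadata above:

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming convention -- an assumption,
# since this flattened record does not spell out the exact dataset path.
data = load_dataset(
    "open-llm-leaderboard/details_psyche__kollama2-7b",
    "harness_winogrande_5",  # per-task config listed in this record's metadata
    split="latest",          # the "latest" split points at the most recent run
)
print(data)
```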
[ "# Dataset Card for Evaluation run of psyche/kollama2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model psyche/kollama2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T17:15:12.197275(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of psyche/kollama2-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model psyche/kollama2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-14T17:15:12.197275(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 17, 31, 165, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psyche/kollama2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psyche/kollama2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T17:15:12.197275(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c048d16b53c98c3865cc500124fadac5153fad49
# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TehVenom/GPT-J-Pyg_PPO-6B](https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T15:30:49.782386](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B/blob/main/results_2023-09-17T15-30-49.782386.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119039,
        "f1": 0.05566380033557068,
        "f1_stderr": 0.0012986027748823588,
        "acc": 0.3376248299846874,
        "acc_stderr": 0.008988978817812654
    },
    "harness|drop|3": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119039,
        "f1": 0.05566380033557068,
        "f1_stderr": 0.0012986027748823588
    },
    "harness|gsm8k|5": {
        "acc": 0.028051554207733132,
        "acc_stderr": 0.004548229533836348
    },
    "harness|winogrande|5": {
        "acc": 0.6471981057616417,
        "acc_stderr": 0.01342972810178896
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
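The card above loads a single task's details; a minimal sketch for pulling the aggregated "results" configuration instead, relying only on the "results" config and "latest" split declared in this record's metadata (and assuming the dataset is publicly readable):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# points at the most recent run, per the config list in this record's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics row for the latest run
```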
open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B
[ "region:us" ]
2023-08-17T23:10:38+00:00
{"pretty_name": "Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/GPT-J-Pyg_PPO-6B](https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T15:30:49.782386](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B/blob/main/results_2023-09-17T15-30-49.782386.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119039,\n \"f1\": 0.05566380033557068,\n \"f1_stderr\": 0.0012986027748823588,\n \"acc\": 0.3376248299846874,\n \"acc_stderr\": 0.008988978817812654\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119039,\n \"f1\": 0.05566380033557068,\n \"f1_stderr\": 0.0012986027748823588\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \"acc_stderr\": 0.004548229533836348\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.01342972810178896\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T15_30_49.782386", "path": ["**/details_harness|drop|3_2023-09-17T15-30-49.782386.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T15-30-49.782386.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T15_30_49.782386", "path": ["**/details_harness|gsm8k|5_2023-09-17T15-30-49.782386.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T15-30-49.782386.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:06:25.891734.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:06:25.891734.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:06:25.891734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:06:25.891734.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:06:25.891734.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T15_30_49.782386", "path": ["**/details_harness|winogrande|5_2023-09-17T15-30-49.782386.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T15-30-49.782386.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_06_25.891734", "path": ["results_2023-07-19T16:06:25.891734.parquet"]}, {"split": "2023_09_17T15_30_49.782386", "path": ["results_2023-09-17T15-30-49.782386.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T15-30-49.782386.parquet"]}]}]}
2023-09-17T14:31:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T15:30:49.782386 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
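The card above mentions loading the details from a run, but the accompanying snippet was stripped along with the URLs; a minimal sketch of that step, again assuming the `open-llm-leaderboard/details_<org>__<model>` repository naming pattern (the `harness_winogrande_5` configuration and the `latest` split come from the metadata listed earlier):

```python
from datasets import load_dataset

# Assumed repository id; only the model name is given in the card above.
repo_id = "open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B"

# "harness_winogrande_5" is one of the configurations defined in this record's metadata;
# its "latest" split points at the most recent evaluation run.
data = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(data)
```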
[ "# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T15:30:49.782386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T15:30:49.782386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T15:30:49.782386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1b26824b85e2d0f68ce20d757c1511e8bfd3994e
# Dataset Card for Evaluation run of TehVenom/Pygmalion-13b-Merged

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/Pygmalion-13b-Merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TehVenom/Pygmalion-13b-Merged](https://huggingface.co/TehVenom/Pygmalion-13b-Merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Pygmalion-13b-Merged",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-22T01:54:40.164227](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Pygmalion-13b-Merged/blob/main/results_2023-10-22T01-54-40.164227.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.1669463087248322,
        "em_stderr": 0.0038191310263365245,
        "f1": 0.2267313338926176,
        "f1_stderr": 0.0038570900818293546,
        "acc": 0.37804284774825825,
        "acc_stderr": 0.006420137883941132
    },
    "harness|drop|3": {
        "em": 0.1669463087248322,
        "em_stderr": 0.0038191310263365245,
        "f1": 0.2267313338926176,
        "f1_stderr": 0.0038570900818293546
    },
    "harness|gsm8k|5": {
        "acc": 0.000758150113722517,
        "acc_stderr": 0.0007581501137225271
    },
    "harness|winogrande|5": {
        "acc": 0.755327545382794,
        "acc_stderr": 0.012082125654159738
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_TehVenom__Pygmalion-13b-Merged
[ "region:us" ]
2023-08-17T23:10:47+00:00
{"pretty_name": "Evaluation run of TehVenom/Pygmalion-13b-Merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/Pygmalion-13b-Merged](https://huggingface.co/TehVenom/Pygmalion-13b-Merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__Pygmalion-13b-Merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T01:54:40.164227](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Pygmalion-13b-Merged/blob/main/results_2023-10-22T01-54-40.164227.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1669463087248322,\n \"em_stderr\": 0.0038191310263365245,\n \"f1\": 0.2267313338926176,\n \"f1_stderr\": 0.0038570900818293546,\n \"acc\": 0.37804284774825825,\n \"acc_stderr\": 0.006420137883941132\n },\n \"harness|drop|3\": {\n \"em\": 0.1669463087248322,\n \"em_stderr\": 0.0038191310263365245,\n \"f1\": 0.2267313338926176,\n \"f1_stderr\": 0.0038570900818293546\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225271\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/Pygmalion-13b-Merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T01_54_40.164227", "path": ["**/details_harness|drop|3_2023-10-22T01-54-40.164227.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T01-54-40.164227.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T01_54_40.164227", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-54-40.164227.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-54-40.164227.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:39:54.874893.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:39:54.874893.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:39:54.874893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:39:54.874893.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:39:54.874893.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T01_54_40.164227", "path": ["**/details_harness|winogrande|5_2023-10-22T01-54-40.164227.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T01-54-40.164227.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_39_54.874893", "path": ["results_2023-07-19T18:39:54.874893.parquet"]}, {"split": "2023_10_22T01_54_40.164227", "path": ["results_2023-10-22T01-54-40.164227.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T01-54-40.164227.parquet"]}]}]}
2023-10-22T00:54:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TehVenom/Pygmalion-13b-Merged ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TehVenom/Pygmalion-13b-Merged on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-22T01:54:40.164227 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of TehVenom/Pygmalion-13b-Merged", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Pygmalion-13b-Merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T01:54:40.164227(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TehVenom/Pygmalion-13b-Merged", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Pygmalion-13b-Merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-22T01:54:40.164227(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/Pygmalion-13b-Merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Pygmalion-13b-Merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T01:54:40.164227(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
05debf2071ae2749f5c095b85d858eba7c0e6091
# Dataset Card for Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TehVenom/DiffMerge-DollyGPT-Pygmalion](https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-09-17T02:27:34.673978](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion/blob/main/results_2023-09-17T02-27-34.673978.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.03261325503355705,
        "em_stderr": 0.0018190171380944463,
        "f1": 0.06326342281879199,
        "f1_stderr": 0.0020903684000438045,
        "acc": 0.2691397000789266,
        "acc_stderr": 0.007005621297482058
    },
    "harness|drop|3": {
        "em": 0.03261325503355705,
        "em_stderr": 0.0018190171380944463,
        "f1": 0.06326342281879199,
        "f1_stderr": 0.0020903684000438045
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5382794001578532,
        "acc_stderr": 0.014011242594964116
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
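Besides the per-task configurations, the card describes an aggregated `results` configuration; a minimal sketch of reading the aggregated metrics from it, assuming it exposes the same `latest` split as the per-task configs:

```python
from datasets import load_dataset

# Aggregated metrics for the TehVenom/DiffMerge-DollyGPT-Pygmalion evaluation runs.
# The "results" configuration is described in the card above; "latest" is assumed to
# point at the most recent run, as it does for the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion",
    "results",
    split="latest",
)
print(results[0])
```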
open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion
[ "region:us" ]
2023-08-17T23:10:56+00:00
{"pretty_name": "Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/DiffMerge-DollyGPT-Pygmalion](https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T02:27:34.673978](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion/blob/main/results_2023-09-17T02-27-34.673978.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03261325503355705,\n \"em_stderr\": 0.0018190171380944463,\n \"f1\": 0.06326342281879199,\n \"f1_stderr\": 0.0020903684000438045,\n \"acc\": 0.2691397000789266,\n \"acc_stderr\": 0.007005621297482058\n },\n \"harness|drop|3\": {\n \"em\": 0.03261325503355705,\n \"em_stderr\": 0.0018190171380944463,\n \"f1\": 0.06326342281879199,\n \"f1_stderr\": 0.0020903684000438045\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5382794001578532,\n \"acc_stderr\": 0.014011242594964116\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/DiffMerge-DollyGPT-Pygmalion", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T02_27_34.673978", "path": ["**/details_harness|drop|3_2023-09-17T02-27-34.673978.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T02-27-34.673978.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T02_27_34.673978", "path": ["**/details_harness|gsm8k|5_2023-09-17T02-27-34.673978.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T02-27-34.673978.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:25.524586.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:25.524586.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T02_27_34.673978", "path": ["**/details_harness|winogrande|5_2023-09-17T02-27-34.673978.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T02-27-34.673978.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_29_25.524586", "path": ["results_2023-07-19T19:29:25.524586.parquet"]}, {"split": "2023_09_17T02_27_34.673978", "path": ["results_2023-09-17T02-27-34.673978.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T02-27-34.673978.parquet"]}]}]}
2023-09-17T01:27:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TehVenom/DiffMerge-DollyGPT-Pygmalion on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-17T02:27:34.673978 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
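The summary above explains how to load per-task details from this evaluation dataset but the snippet itself is not included in this processed text. A minimal, hedged sketch: the repository id and configuration name below are taken from this record's metadata, and `split="train"` follows the card's note that the train split always points at the latest results.

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this model's evaluation run.
# Repository id and config name come from this record's metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_TehVenom__DiffMerge-DollyGPT-Pygmalion",
    "harness_winogrande_5",
    split="train",  # per the card, "train" tracks the latest run
)
print(data)
```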
[ "# Dataset Card for Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/DiffMerge-DollyGPT-Pygmalion on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T02:27:34.673978(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/DiffMerge-DollyGPT-Pygmalion on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-17T02:27:34.673978(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/DiffMerge-DollyGPT-Pygmalion## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/DiffMerge-DollyGPT-Pygmalion on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T02:27:34.673978(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
7cf782dd3d98ad81cf56d65341bd469c835861ee
# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4](https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-18T06:55:06.084057](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4/blob/main/results_2023-10-18T06-55-06.084057.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.0004191330178826871,
        "f1": 0.058036912751678014,
        "f1_stderr": 0.001339441597906354,
        "acc": 0.3295242323804896,
        "acc_stderr": 0.008622843965649133
    },
    "harness|drop|3": {
        "em": 0.0016778523489932886,
        "em_stderr": 0.0004191330178826871,
        "f1": 0.058036912751678014,
        "f1_stderr": 0.001339441597906354
    },
    "harness|gsm8k|5": {
        "acc": 0.018953752843062926,
        "acc_stderr": 0.0037560783410314704
    },
    "harness|winogrande|5": {
        "acc": 0.6400947119179163,
        "acc_stderr": 0.013489609590266795
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
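The card above loads a single task configuration, but it also describes a "results" configuration holding the aggregated metrics and a "latest" split for each configuration. A minimal sketch of reading those aggregates, assuming the "results" configuration and "latest" split are registered as described in the card and its accompanying metadata:

```python
from datasets import load_dataset

# Load the aggregated results of the latest evaluation run for this model.
# "results" and "latest" are the config/split names declared in the dataset metadata.
results = load_dataset(
    "open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4",
    "results",
    split="latest",
)
print(results[0])  # one row containing the aggregated metrics shown above
```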
open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4
[ "region:us" ]
2023-08-17T23:11:05+00:00
{"pretty_name": "Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4](https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T06:55:06.084057](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4/blob/main/results_2023-10-18T06-55-06.084057.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826871,\n \"f1\": 0.058036912751678014,\n \"f1_stderr\": 0.001339441597906354,\n \"acc\": 0.3295242323804896,\n \"acc_stderr\": 0.008622843965649133\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826871,\n \"f1\": 0.058036912751678014,\n \"f1_stderr\": 0.001339441597906354\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.018953752843062926,\n \"acc_stderr\": 0.0037560783410314704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6400947119179163,\n \"acc_stderr\": 0.013489609590266795\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T06_55_06.084057", "path": ["**/details_harness|drop|3_2023-10-18T06-55-06.084057.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T06-55-06.084057.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T06_55_06.084057", "path": ["**/details_harness|gsm8k|5_2023-10-18T06-55-06.084057.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T06-55-06.084057.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:54:40.304544.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:54:40.304544.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:54:40.304544.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T06_55_06.084057", "path": ["**/details_harness|winogrande|5_2023-10-18T06-55-06.084057.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T06-55-06.084057.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_54_40.304544", "path": ["results_2023-07-19T15:54:40.304544.parquet"]}, {"split": "2023_10_18T06_55_06.084057", "path": ["results_2023-10-18T06-55-06.084057.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T06-55-06.084057.parquet"]}]}]}
2023-10-18T05:55:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-18T06:55:06.084057 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
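A minimal sketch of the load step referenced above ("To load the details from a run, you can for instance do the following:"). The repository id is not spelled out on this line, so `open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4` is an assumption based on the `details_<org>__<model>` naming pattern these evaluation-detail datasets use; the config name `harness_winogrande_5` and the `latest` split come from the config metadata listed earlier in this entry.

```python
from datasets import load_dataset

# Repository id assumed from the details_<org>__<model> naming convention.
REPO_ID = "open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4"

# Each evaluated task has its own config; the "latest" split points at the most recent run.
data = load_dataset(REPO_ID, "harness_winogrande_5", split="latest")
print(data)
```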
[ "# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T06:55:06.084057(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-18T06:55:06.084057(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 32, 31, 180, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T06:55:06.084057(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e4da4a9df440e2d6c24c0e8396334860c7821041
# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-V8p4_Dev-6b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/PPO_Shygmalion-V8p4_Dev-6b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TehVenom/PPO_Shygmalion-V8p4_Dev-6b](https://huggingface.co/TehVenom/PPO_Shygmalion-V8p4_Dev-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-V8p4_Dev-6b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-19T06:04:59.904183](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-V8p4_Dev-6b/blob/main/results_2023-10-19T06-04-59.904183.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.001363255033557047,
        "em_stderr": 0.00037786091964606367,
        "f1": 0.05447671979865783,
        "f1_stderr": 0.001290676035215164,
        "acc": 0.3325412749009229,
        "acc_stderr": 0.00893055367392792
    },
    "harness|drop|3": {
        "em": 0.001363255033557047,
        "em_stderr": 0.00037786091964606367,
        "f1": 0.05447671979865783,
        "f1_stderr": 0.001290676035215164
    },
    "harness|gsm8k|5": {
        "acc": 0.02577710386656558,
        "acc_stderr": 0.004365042953621815
    },
    "harness|winogrande|5": {
        "acc": 0.6393054459352802,
        "acc_stderr": 0.013496064394234024
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-V8p4_Dev-6b
[ "region:us" ]
2023-08-17T23:11:14+00:00
{"pretty_name": "Evaluation run of TehVenom/PPO_Shygmalion-V8p4_Dev-6b", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/PPO_Shygmalion-V8p4_Dev-6b](https://huggingface.co/TehVenom/PPO_Shygmalion-V8p4_Dev-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-V8p4_Dev-6b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T06:04:59.904183](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-V8p4_Dev-6b/blob/main/results_2023-10-19T06-04-59.904183.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606367,\n \"f1\": 0.05447671979865783,\n \"f1_stderr\": 0.001290676035215164,\n \"acc\": 0.3325412749009229,\n \"acc_stderr\": 0.00893055367392792\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606367,\n \"f1\": 0.05447671979865783,\n \"f1_stderr\": 0.001290676035215164\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \"acc_stderr\": 0.004365042953621815\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6393054459352802,\n \"acc_stderr\": 0.013496064394234024\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/PPO_Shygmalion-V8p4_Dev-6b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T06_04_59.904183", "path": ["**/details_harness|drop|3_2023-10-19T06-04-59.904183.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T06-04-59.904183.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T06_04_59.904183", "path": ["**/details_harness|gsm8k|5_2023-10-19T06-04-59.904183.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T06-04-59.904183.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:59:40.627268.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:59:40.627268.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:59:40.627268.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:59:40.627268.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:59:40.627268.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:59:40.627268.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T06_04_59.904183", "path": ["**/details_harness|winogrande|5_2023-10-19T06-04-59.904183.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T06-04-59.904183.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_59_40.627268", "path": ["results_2023-07-19T15:59:40.627268.parquet"]}, {"split": "2023_10_19T06_04_59.904183", "path": ["results_2023-10-19T06-04-59.904183.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T06-04-59.904183.parquet"]}]}]}
2023-10-19T05:05:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-V8p4_Dev-6b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-V8p4_Dev-6b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-19T06:04:59.904183 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
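The summary above ends with "To load the details from a run, you can for instance do the following:", but the accompanying code block was dropped when the card text was flattened into this field. Below is a minimal sketch of that call; the repository name is inferred from the leaderboard's usual `details_<org>__<model>` naming pattern and is an assumption rather than something stated in this record, while the configuration and split names are taken from the metadata above.

```python
from datasets import load_dataset

# Repository name inferred from the Open LLM Leaderboard naming convention
# (an assumption; this record does not spell it out).
repo = "open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-V8p4_Dev-6b"

# "harness_winogrande_5" and the "latest" split both appear in the record's metadata.
data = load_dataset(repo, "harness_winogrande_5", split="latest")
print(data)
```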
[ "# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-V8p4_Dev-6b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-V8p4_Dev-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T06:04:59.904183(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-V8p4_Dev-6b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-V8p4_Dev-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-19T06:04:59.904183(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-V8p4_Dev-6b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-V8p4_Dev-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T06:04:59.904183(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1be19f0d7f6d31e8968e01b7244c898bf2507e68
# Dataset Card for Evaluation run of TehVenom/Dolly_Shygmalion-6b-Dev_V8P2

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/TehVenom/Dolly_Shygmalion-6b-Dev_V8P2
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [TehVenom/Dolly_Shygmalion-6b-Dev_V8P2](https://huggingface.co/TehVenom/Dolly_Shygmalion-6b-Dev_V8P2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Dolly_Shygmalion-6b-Dev_V8P2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-12T17:41:38.802429](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Dolly_Shygmalion-6b-Dev_V8P2/blob/main/results_2023-10-12T17-41-38.802429.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119019,
        "f1": 0.05151635906040278,
        "f1_stderr": 0.0012498327026492658,
        "acc": 0.3314818394026232,
        "acc_stderr": 0.008646188468382226
    },
    "harness|drop|3": {
        "em": 0.0012583892617449664,
        "em_stderr": 0.0003630560893119019,
        "f1": 0.05151635906040278,
        "f1_stderr": 0.0012498327026492658
    },
    "harness|gsm8k|5": {
        "acc": 0.019711902956785442,
        "acc_stderr": 0.0038289829787357212
    },
    "harness|winogrande|5": {
        "acc": 0.6432517758484609,
        "acc_stderr": 0.013463393958028732
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_TehVenom__Dolly_Shygmalion-6b-Dev_V8P2
[ "region:us" ]
2023-08-17T23:11:23+00:00
{"pretty_name": "Evaluation run of TehVenom/Dolly_Shygmalion-6b-Dev_V8P2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/Dolly_Shygmalion-6b-Dev_V8P2](https://huggingface.co/TehVenom/Dolly_Shygmalion-6b-Dev_V8P2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__Dolly_Shygmalion-6b-Dev_V8P2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T17:41:38.802429](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Dolly_Shygmalion-6b-Dev_V8P2/blob/main/results_2023-10-12T17-41-38.802429.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119019,\n \"f1\": 0.05151635906040278,\n \"f1_stderr\": 0.0012498327026492658,\n \"acc\": 0.3314818394026232,\n \"acc_stderr\": 0.008646188468382226\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119019,\n \"f1\": 0.05151635906040278,\n \"f1_stderr\": 0.0012498327026492658\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \"acc_stderr\": 0.0038289829787357212\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6432517758484609,\n \"acc_stderr\": 0.013463393958028732\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/Dolly_Shygmalion-6b-Dev_V8P2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T17_41_38.802429", "path": ["**/details_harness|drop|3_2023-10-12T17-41-38.802429.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T17-41-38.802429.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T17_41_38.802429", "path": ["**/details_harness|gsm8k|5_2023-10-12T17-41-38.802429.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T17-41-38.802429.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:53:13.487601.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:53:13.487601.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:53:13.487601.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:53:13.487601.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:53:13.487601.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:53:13.487601.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T17_41_38.802429", "path": ["**/details_harness|winogrande|5_2023-10-12T17-41-38.802429.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T17-41-38.802429.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_53_13.487601", "path": ["results_2023-07-19T15:53:13.487601.parquet"]}, {"split": "2023_10_12T17_41_38.802429", "path": ["results_2023-10-12T17-41-38.802429.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T17-41-38.802429.parquet"]}]}]}
2023-10-12T16:41:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TehVenom/Dolly_Shygmalion-6b-Dev_V8P2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TehVenom/Dolly_Shygmalion-6b-Dev_V8P2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-12T17:41:38.802429 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of TehVenom/Dolly_Shygmalion-6b-Dev_V8P2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Dolly_Shygmalion-6b-Dev_V8P2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T17:41:38.802429(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TehVenom/Dolly_Shygmalion-6b-Dev_V8P2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Dolly_Shygmalion-6b-Dev_V8P2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-12T17:41:38.802429(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 177, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/Dolly_Shygmalion-6b-Dev_V8P2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Dolly_Shygmalion-6b-Dev_V8P2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T17:41:38.802429(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]